Exams, Demand, and a Short Lesson in Controlling Public Discourse

by Dan H

Dan observes that exams may or may not be getting easier, but journalists are as disingenuous as ever
~
Edited to add: As somebody points out in the comments below, I'm kind of a dick to journalists in the article and for that I do genuinely apologise, I make some unfair overgeneralisations about the profession which were cheap and unsupported. As always when this sort of thing happens, I am leaving the text as originally published, because I believe it is important to own your mistakes.

Ofqual, the UK regulatory body overseeing qualifications, published a report last week which – according to journalists across the country – “proves” that exams “are getting easier”, which is leading to “falling standards”. The BBC opened with the only-slightly-sensationalist “Science and geography exams easier, Ofqual says”, while the print edition of the Daily Mail went with “It's Official! Exams HAVE Gotten Easier”, and Melanie Phillips (a woman who, despite having, as far as I can tell, spent her entire career in journalism and none of it in education, considers herself so qualified to comment on the subject that she wrote a book about it) declares that “Everyone knows exams have got easier over the last decade. The only surprise is that the educational establishment has admitted it.”

Before we do anything else, let's look at what the report actually said:


  • It said that GCSE biology had become easier because it included more multiple choice questions.

  • It said that A-Level biology exams set by WJEC (the Welsh Joint Education Committee) specifically had too many structured and short-answer questions.

  • It said that the removal of coursework from the geography syllabus “reduced demand”. This is interesting in that a lot of people tend to criticise coursework as a “soft option”.

  • It said that A-Level Chemistry questions had become easier as a result of questions being structured differently.



And that's it. Or at least, that's the specific information given in the newspaper reports, none of which cite the actual report all of this information comes from (one of the great ironies of education reporting is that the academic skills which universities – quite rightly – argue that students should learn in school are skills which the journalists who parrot those arguments show no evidence of having mastered). I suspected that they might be referring to the correspondence between Michael Gove and Glenys Stacey (here and here) concerning the contents of this report, but having read the hundred-and-sixty-nine-page report in detail, it appears not to be the source.

This article is going to be in several parts, because I do want to talk a bit about the A-Level system (because the truth, despite what Melanie Phillips and her ilk believe, is rather more complicated than “hard good, easy bad”), but I want to start off talking about the straight-up shoddiness of the reporting. One hardly expects the Mail to be a bastion of journalistic integrity, but even comparatively respectable outlets like the Beeb and the Telegraph failed to make the seemingly obvious distinction between “some exams set by some boards have become easier in some ways” and “exams are getting easier” (and in the case of the good Ms Phillips, they failed to make the distinction between “exams are getting easier,” “the exams are getting easier because of political correctness gone mad,” and “the sky is falling”). This is as disingenuous as, well, to be honest it's as disingenuous as pretty much every newspaper report drawn from an academic study I've ever seen. Graeme Paton in the Telegraph at least keeps the accusations to a minimum and points out (as the BBC and the Mail failed to) that most of the exams the report was about were no longer even being taught (the study focused on exams set between 2003 and 2008, which was before the last big A-Level shakeup). This observation did not stop his article from being subtitled: “GCSEs and A-levels in key subjects have become easier following a 10-year dumbing down of exam papers”.

A small word of advice to journalists. If you are going to publish ill-informed articles which reduce sophisticated studies, produced by professional bodies investigating complex issues, to one-line soundbites, you might want to avoid using the phrase “dumbing down”, because it makes you look just the tiniest bit hypocritical.

The annoying thing about the way this whole situation has been reported is that it glosses over some questions that are actually rather interesting. Those questions being: “are exams getting easier?” (and, whatever Melanie Phillips may believe, “everybody knows they are” does not in fact constitute evidence), “what does it mean for an exam to get easier?” and, finally and perhaps most importantly, “if exams are getting easier, is that necessarily a bad thing?”

Question the First: Have Exams Got Easier?

Short answer: Probably.

Long answer: OCR (Oxford, Cambridge and RSA Examinations) is one of several exam boards offering A-Levels in England; it alone offers more than sixty A-Level subjects, and each board that offers a given subject sets its own syllabus. We're talking about hundreds of exams being sat by thousands of candidates, twice a year, for decades. If every exam was measurably easier than the exam set in the previous session, the whole system would have become farcical years ago. The number of students achieving A-grades at A-level increased by six percentage points in the last ten years; some of that might reflect a change in the demand of exams, some of it might reflect a change in the quality of teaching, and some of it might reflect the simple fact that every year a syllabus remains active, there is another year's worth of past papers to look at.

It is certainly true that things get taken off syllabuses, and it is equally true that things get put on to syllabuses. I honestly and sincerely believe that the changes made to the OCR Physics-A A-Level syllabus in 2008 (first examination in 2009) – this being the syllabus I teach, and have been teaching since 2006 – made it harder in several ways, not all of which actually made the exam a better test of the students' understanding of Physics.

Over the next few years, I expect the exams to get “easier” in several senses. Firstly, and most importantly, I expect the exams to become easier for me to prepare my students for, simply because I expect that I, personally, will understand the syllabus better. Similarly, I expect the examiners to settle into their role, to provide a more consistent quality of exam with more predictable questions, and to make fewer flat-out mistakes (one question this January asked the students to describe the limitations of radiocarbon dating, but did not credit answers which assumed the use of a mass spectrometer).

So yes, exams are likely to carry on getting easier in real terms. But this is very different from what people are complaining about, which is exams getting easier in some objective, absolute way. There is (peculiarly for the internet) some quite interesting discussion of this issue in the comments section of this Bad Science article from 2010. I was particularly impressed by the comment from “GerryP” who, comparing his experiences of A-Level to those of his sons, observes:


Comparing my examinations with theirs is of course difficult but there are clear differences. Modern exam questions are longer and more structured, they take the student through the answer in a much more structured way. The old A Levels were more predictable, we bought books of past exam papers and spent hours working through the past questions. Our physics teacher would post a list of 'predicted' exam topics a couple of weeks before the exam.


This little personal anecdote nicely highlights the problem with the received idea that exams have “obviously” got easier because you “just have to look”. An “easy” question answered sight unseen is often rather trickier than a “hard” question for which you have been drilled.

Another, rather less interesting commenter links to this O-Level paper as evidence of the shocking decline in standards (the same commenter later loudly asks “WHAT IS THE POINT OF AN EXAM FOR THICK PEOPLE”, which seems to be a depressingly common attitude; surely the answer is that it's the same as the point of an exam for clever people). It's easy to look at that sort of exam paper and imagine that, because it is clearly “harder” than a current GCSE exam, it is therefore a better test of student ability.

But this fails to recognise the context in which the exam was taught. Obviously I didn't go through that system myself, but looking at that O-level paper, every single question on it seems to involve the student applying a specific mathematical technique which they have learned by rote. This might be hard to do, but it isn't really learning in any sense that I would consider useful.

The same site contains a 1953 A-Level Physics paper. I'm a bit more qualified to talk about this than about the Maths paper, because I both studied and teach this subject, and all I can say is: holy shit, it's all definitions. Was it harder to get an “A” on this paper than on a modern paper? Probably. Would getting an A on this paper require you to show a greater understanding of Physics than getting an A on a modern paper? It most assuredly would not.

The 1953 paper is “hard” but it's hard for all the wrong reasons. The questions test nothing but recall and basic mathematics, but they are difficult to answer correctly because they are often vaguely worded or simply poorly laid out. It is true that the exam tests a number of things that are not on the current syllabus (like a very, very small amount of A.C. Circuit theory) but it is equally true that the exam has enormous, glaring omissions (the lack of nuclear physics can be explained by that subject being relatively new, but there's also a peculiar lack of mechanics). None of the questions require the student to apply their knowledge in any kind of unfamiliar context – a vital skill for a scientist – and there is remarkably little quantitative content.

Which brings me rather neatly to the next important question.

Question the Second: What Is Difficulty?

The thing is, I do actually think exams have got easier. I just don't necessarily think that's a bad thing. Contrary to what Melanie Phillips may believe, that isn't because I'm some wet, hand-wringing liberal who believes (as she puts it in the book she wrote about an area in which she has never worked) that “all must have prizes”, but rather because I'm an educational professional who knows quite how easy it is to write a hard exam question, and quite how hard it is to write an easy one.

Take, for example, the following question: “What is the boiling point of water?”

This question appeared on one of our in-house exams a year or so ago, and of course all of our students immediately answered “100 degrees centigrade” and were promptly awarded zero marks because the answer the exam-setter had been looking for was “the temperature at which liquid water becomes steam.”

Something you might have noticed, if you read through all of those infuriating standards-are-slipping articles, is that while the articles themselves talked about exams getting “easier”, the word that was used whenever the original report was quoted was “demand”. Demand and difficulty are two very different things. Vague, ambiguously worded questions, questions that are not well laid out, or questions that otherwise leave the student unclear about what they are being asked all make an exam more difficult, but they do not increase demand.

For example, the first question on the 1953 Physics paper reads:


Show that small vertical oscillations of a mass suspended by a light spring from a rigid support are simple harmonic. What condition must be fulfilled if the oscillation is to remain simple harmonic when the oscillation is no longer small? Why do the oscillations gradually decrease in amplitude as the mass continues to oscillate?
Describe an experiment in which a loaded spring is used to determine the acceleration due to gravity.
A spring is such that a load of 100gm stretches it by 20cm. When a load of 60gm is attached to the spring and set oscillating vertically there are 50 oscillations in 34.7 seconds. Calculate the acceleration due to gravity.


This question is not subdivided in any way, except by paragraph breaks. The first paragraph encompasses three questions, one of which actually contradicts the first two (the question “why do the oscillations gradually decrease in amplitude as the mass continues to oscillate?” assumes the existence of damping, but damped motion is not simple harmonic). The question could be made easier (and crucially no less demanding) by breaking it down into sections, rewording it so that it wasn't physically incorrect, and perhaps including a diagram to more clearly show the situation being described.
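As an aside, the numbers in that final part do at least come out cleanly. Here is a minimal sketch of the working (my own, not anything from the original mark scheme), assuming the usual SI conversions of the quantities quoted: the timing data fix the spring constant via the period formula for simple harmonic motion, and Hooke's law applied to the static stretch then gives g.

import math

# Quantities from the question, converted to SI units
extension_mass = 0.100    # kg: the 100 gm load used for the static stretch
extension = 0.20          # m: the 20 cm extension it produces
oscillating_mass = 0.060  # kg: the 60 gm load set oscillating
period = 34.7 / 50        # s: 50 oscillations in 34.7 seconds

# For simple harmonic motion, T = 2*pi*sqrt(m/k), so k = 4*pi^2*m / T^2
k = 4 * math.pi ** 2 * oscillating_mass / period ** 2

# Hooke's law for the static stretch: k * extension = extension_mass * g
g = k * extension / extension_mass
print(round(g, 2))  # 9.84 (m/s^2)

Which is rather the point: the calculation being asked for is perfectly reasonable; it is the presentation wrapped around it that makes the question harder than it needs to be.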

This, I suspect, is around the point where a certain sort of person starts to complain about “dumbing down”. Why, after all, should we bother to write exam questions just to make things easier for people who are too stupid to read them properly? Leaving aside, for the moment, the fact that I suspect the very dry, very formal, very wall-of-text format of the 1953 A-Level almost certainly disadvantages students from working-class backgrounds, as well as students for whom English is an additional language (and I confess that whenever people complain about falling A-Level standards, I often suspect that the real complaint is that blacks and poor people have started taking them), it's also just plain lazy.

The problem here is that most people from outside the world of education have a very narrow, very simplistic view of what an exam is for. Like the commenter from the Bad Science post, people really do want to know WHAT IS THE POINT OF AN EXAM FOR THICK PEOPLE? They think that the job of an exam is to let the “best” people show how good they are, and fuck everybody else. From this narrow point of view, making exams arbitrarily “hard” is indeed desirable. The more pointless, arbitrary hoops you make people jump through, the more likely it is that only the “best” people will get through those hoops, where the “best” people are defined (circularly) as the ones who were most able to get through the pointless, arbitrary hoops in the first place.

This is just not what exams are for, and the notion that it is what they are for is elitist bullshit. “Elitist,” by the way, being one of those words (like “arrogant”) that stupid people think is secretly cool when it isn't. People think “arrogant” means “is awesome and knows it” when it actually means “is shit and doesn't know it,” while they think “elitist” means “tells it like it is and doesn't coddle people” when it actually means “is mortally fucking terrified of having to actually compete with people on a level playing field.” People want exams to be “hard” so that they can continue to feel superior to people who are less good at exams than they are. Monolingual English speakers baulk at the idea that we could rephrase exam questions to make them more accessible to people whose native language is not English, because we enjoy looking down on people who speak our language only slightly less well than they speak their own.

Exams are there to test your knowledge of a subject. They're not perfect – in fact they're a lot like democracy: the worst system except for all the others. Making exams easier is completely appropriate if the original source of the difficulty was something that bore no relationship to the subject of the exam. A couple of months ago there was a big stink about the fact that in the American SATs exam essays were not penalised for factual errors, but this again was a perfectly sensible decision. If you want to test a student's ability to write, you don't dock them marks for writing about things that aren't true – nobody was ever penalised in their French oral for saying their father was a doctor when he was really a taxi driver.

There is, of course, a lot of grey area here, mostly around the distinction between structured and unstructured questions. It is certainly true that working through multi-part questions without guidance is a useful analytical skill, but it is a specific skill and should be tested only by those questions that are trying to test it. There is as much difference between a deliberately unstructured question and a badly structured one as there is between requiring my students to research a topic independently and simply failing to show up for work.

Again it's interesting to bring this all back to the 1953 exam paper, which is supposed to be our gold standard for a “good” “hard” A-Level exam. Again, I observe that the vast majority of what it tests is simple recall (which most educators and halfway sensible people agree is by far the least demanding mode of assessment). It would be an unmitigated disaster for education if we were to allow the “harder is better” doctrine to be taken even remotely seriously, because all we would wind up doing is turning exams into the equivalent of a badly written text adventure, where success depends entirely on your ability to read the mind of the person who wrote the damned thing.

Question the Third: Does It Matter if Exams Get Easier?

The title of this article promises, amongst other things, “A Short Lesson in Controlling Public Discourse.” The “exams are getting easier” doctrine is a fine example of a particularly pernicious technique for controlling the public debate about a particular topic. The technique works as follows:


  1. Decide you want things to be a particular way

  2. Declare that things are now less the way you like than they were in the past

  3. Get angry about it



Immediately, the debate becomes about one thing and one thing only: whether your observation that things are getting less like you want them to be is correct. Meanwhile everybody – even people who really should know better – concedes your unstated assumption that it is desirable for things to be the way you want them.

The job of exams is not to be hard, it is not to be easy, it is most certainly not to provide the top five percent of students with a flashy qualification they can use to get into Oxford. The job of exams is to test learning and produce adequate differentiation across the full range of candidates. This, amongst other things, is why we need what that charming individual called “Exams for Thick People”. The job of an exam is not to let clever people show off, it is to actually assess people, and that means differentiating between D and E grade candidates just as much as it means differentiating between A and B grade candidates. Complaining that exams are getting easier is just a socially acceptable way of complaining that we're no longer restricting education to a privileged elite.

That isn't to say that there aren't problems with the current system. The current A-level system does not, by itself, provide adequate differentiation between the very best students and the merely very good. But this problem cuts far deeper than the difference between “easy” and “hard” exams, and it has always existed. At the highest levels, exams simply become a bad method of distinguishing between candidates. There is, to put it simply, a reason that Oxford interviews people.

It is, as I said at the start, almost certainly true that exams are getting easier in some ways, but this is often for quite unexpected reasons. Perhaps most interestingly, the recent Ofqual report on the viability of the A-level suggests that the proliferation of multiple-choice questions and short structured answers over longer essay-style questions has more to do with making exams easier to administer than anything else. There is simply a shortage of skilled examiners (students, and adults who work outside education, tend to forget that people actually have to mark exams), which means that it is impractical to mark large volumes of complex written questions.

Perhaps what I find most infuriating about the “exams are getting easier” mantra, apart from the fact that it masterfully deploys a rather manipulative rhetorical strategy, is that it is based almost entirely on the very obsession with letter-grades that it condemns. Nobody bothers to look at actual questions, or actual syllabus content (at least not until Ofqual comes along and does it for them); nobody pays attention to what students are actually learning. People just look at a 1.7% increase in the number of A-grades and insist that it is incontrovertible evidence that “standards” are slipping. Just A-grades, of course, because people really do not seem to believe that grades below A exist at all, or if they do, they certainly don't seem to think that the people who get them are at all important (we're back to “exams for thick people” again). Is it any wonder that the number of A-grades grows year-on-year when, in the eyes of half the population, any grade below an A is a fail?

Would some people who got Cs at A-level in 1953 be better suited for university than people getting As at A-level now? Almost certainly. But I'll let you in on a secret. There are some people getting Cs at A-level right now who are better suited for university than other people getting As at A-level on the same exam. Exams, all exams, GCSEs, A-levels, Cambridge Pre-U, the IB, all of them, are imperfect tools. Worse, they are imperfect tools with no specific purpose – existing partly to help universities select candidates, partly as a qualification for eighteen-year-old school-leavers, partly as learning for its own sake. They need to impart subject knowledge, independent study skills, and maturity of outlook. They need to prepare students for university or for work, in England or abroad.

And different aspects of the function of A-levels are important to different people – my brother (a mathematician) insists that maths teaching in the UK is flawed because it focuses too much on making students rote-learn specific techniques, instead of teaching them an understanding of the fundamental properties of number. The average UK journalist thinks that maths teaching in the UK is flawed because school leavers are bad at mental arithmetic and haven't rote-learned enough specific techniques (the strangest article I've seen about maths teaching is this one in the Telegraph, which somehow manages to cite the fact that thirty percent of parents said they were unable to do their kids' maths homework as evidence that maths teaching was getting worse).

There is no way on Earth that a single letter-grade can summarise the extent to which a student has internalised all of the things that they are required to learn over the course of their A-Levels or, for that matter, their entire school career. Making exams “harder” would not make that single letter-grade convey more information, or make it correlate better with all the dozens of different things it is supposed to correlate with. It certainly won't tell us who the “best” people are and it is absurd to suggest that it should.

It's finals season in Oxford at the moment, and throughout the city students are sitting their exams, and a good number of them are going to get better results than I did. And of course I would find it comforting to pretend that, because I did my A-levels fifteen years before they did, I am somehow smarter or better educated than they are, but it simply does not work that way, no matter how much I might want it to.

The simple fact is that systems of education are not commensurable. Judging the value of a person's education by how closely it matches your own is the very definition of small-mindedness and unbecoming of a person who considers themselves educated.
Themes: Topical