Exams, Demand, and a Short Lesson in Controlling Public Discourse

by Dan H

Dan observes that exams may or may not be getting easier, but journalists are as disingenuous as ever
~
Edited to add: As somebody points out in the comments below, I'm kind of a dick to journalists in the article, and for that I do genuinely apologise; I made some unfair overgeneralisations about the profession which were cheap and unsupported. As always when this sort of thing happens, I am leaving the text as originally published, because I believe it is important to own your mistakes.

Ofqual, the UK regulatory body overseeing qualifications, published a report last week which – according to journalists across the country – “proves” that exams “are getting easier”, leading to “falling standards”. The BBC opened with the only-slightly-sensationalist “Science and geography exams easier, Ofqual says”, while the print edition of the Daily Mail opened with “It's Official! Exams HAVE Gotten Easier”, and Melanie Phillips (a woman who, despite – as far as I can tell – being a career journalist who has never been remotely involved in Education, considers herself so qualified to comment on the subject that she wrote a book about it) declares that “Everyone knows exams have got easier over the last decade. The only surprise is that the educational establishment has admitted it.”

Before we do anything else, let's look at what the report actually said:


  • It said that GCSE biology had become easier because it included more multiple choice questions.

  • It said that A-Level biology exams set by WJEC (the Welsh Joint Education Committee) specifically had too many structured and short-answer questions.

  • It said that the removal of coursework from the geography syllabus “reduced demand”. This is interesting in that a lot of people tend to criticise coursework as a “soft option”.

  • It said that A-Level Chemistry questions had become easier as a result of questions being structured differently.



And that's it. Or at least, that's the specific information given in the newspaper reports, none of which cite the actual report all of this information comes from (one of the great ironies of Education reporting is that the academic skills which universities – quite rightly – argue that students should learn in school are skills which the journalists who parrot those arguments show no evidence of having mastered). I suspected that they might be referring to the correspondence between Michael Gove and Glenys Stacey (here and here) concerning the contents of this report, but having read the hundred-and-sixty-nine-page report in detail it appears not to be the source.

This article is going to be in several parts, because I do want to talk a bit about the A-Level system (because the truth, despite what Melanie Phillips and her ilk believe, is rather more complicated than “hard good, easy bad”) but I want to start off talking about the straight-up shoddiness of the reporting. One hardly expects the Mail to be a bastion of journalistic integrity, but even comparatively respectable outlets like the Beeb and the Telegraph failed to make the seemingly obvious distinction between “some exams set by some boards have become easier in some ways” and “exams are getting easier” (and in the case of the good Ms Phillips, they failed to make the distinction between “exams are getting easier,” “the exams are getting easier because of political correctness gone mad,” and “the sky is falling”). This is as disingenuous as, well, to be honest it's as disingenuous as pretty much every newspaper report drawn from an academic study I've ever seen. Graeme Paton in the Telegraph at least keeps the accusations to a minimum and points out (as the BBC and the Mail failed to) that most of the exams the report was about were no longer even being taught (the study focused on exams set between 2003 and 2008, which was before the last big A-Level shakeup). This observation did not stop the article from being subtitled: “GCSEs and A-levels in key subjects have become easier following a 10-year dumbing down of exam papers”.

A small word of advice to journalists. If you are going to publish ill-informed articles which reduce sophisticated studies produced by professional bodies investigating complex issues into one-line soundbites, you might want to avoid using the phrase “dumbing down” because it makes you look just the tiniest bit hypocritical.

The annoying thing about the way this whole situation has been reported is that it glosses over some questions that are actually rather interesting. Those questions being “are exams getting easier?” (and whatever Melanie Phillips may believe, “everybody knows they are” does not in fact constitute evidence), “what does it mean for an exam to get easier?” and, finally and perhaps most importantly, “if exams are getting easier, is that necessarily a bad thing?”

Question the First: Have Exams Got Easier?

Short answer: Probably.

Long answer: OCR (Oxford, Cambridge and RSA Examinations) is one of several exam boards offering A-Levels in England, and it alone offers more than sixty A-Level subjects; the syllabus for each of those subjects will be different for each exam board that offers it. We're talking about hundreds of exams being sat by thousands of candidates, twice a year, for decades. If every exam was measurably easier than the exam set in the previous session, the whole system would have become farcical years ago. The number of students achieving A-grades at A-level increased by six percentage points in the last ten years; some of that might reflect a change in the demand of exams, some of it might reflect a change in the quality of teaching, and some of it might reflect the simple fact that every year a syllabus remains active, there is another year's worth of past papers to look at.

It is certainly true that things get taken off syllabuses, and it is equally true that things get put on to syllabuses. I honestly and sincerely believe that the changes made to the OCR Physics-A A-Level syllabus in 2008 (first examination in 2009) – this being the syllabus I teach, and have been teaching since 2006 – made it harder in several ways, not all of which actually made the exam a better test of the students' understanding of Physics.

Over the next few years, I expect the exams to get “easier” in several senses. Firstly, and most importantly, I expect the exams to become easier for me to prepare my students for, simply because I expect that I, personally, will understand the syllabus better. Similarly, I expect the examiners to settle into their role, to provide a more consistent quality of exam, to set more predictable questions, and to make fewer flat-out mistakes (one question this January asked the students to describe the limitations of radiocarbon dating, but did not credit answers which assumed the use of a mass spectrometer).

So yes, exams are likely to carry on getting easier in real terms. But this is very different from what people are complaining about, which is exams getting easier in some objective, absolute way. There is (peculiarly for the internet) some quite interesting discussion of this issue in the comments section of this Bad Science article from 2010. I was particularly impressed by the comment from “GerryP” who, comparing his experiences of A-Level to those of his sons, observes:


Comparing my examinations with theirs is of course difficult but there are clear differences. Modern exam questions are longer and more structured, they take the student through the answer in a much more structured way. The old A Levels were more predictable, we bought books of past exam papers and spent hours working through the past questions. Our physics teacher would post a list of 'predicted' exam topics a couple of weeks before the exam.


This little personal anecdote nicely highlights the problem with the received idea that exams have “obviously” got easier because you “just have to look”. An “easy” question answered sight-unseen is often rather trickier than a “hard” question for which you have been drilled.

Another, rather less interesting commenter links to this O-Level paper as evidence of the shocking decline in standards (the same commenter later loudly asks “WHAT IS THE POINT OF AN EXAM FOR THICK PEOPLE” – which seems to be a depressingly common attitude; surely the answer is that it's the same as the point of an exam for clever people). It's easy to look at that sort of exam paper and imagine that because it is clearly “harder” than a current GCSE exam it is therefore a better test of student ability.

But this fails to recognise the context in which the exam was taught. Obviously I didn't go through that system myself, but looking at that O-level paper, every single question on it seems to involve the student applying a specific mathematical technique which they have learned by rote. This might be hard to do, but it isn't really learning in any sense that I would consider useful.

The same site contains a 1953 A-Level Physics paper. I'm a bit more qualified to talk about this than the Maths paper, because I both studied and teach this subject and all I can say is: holy shit it's all definitions. Was it harder to get an “A” on this paper than on a modern paper? Probably. Would getting an A on this paper require you to show a greater understanding of Physics than getting an A on a modern paper? It most assuredly would not.

The 1953 paper is “hard” but it's hard for all the wrong reasons. The questions test nothing but recall and basic mathematics, but they are difficult to answer correctly because they are often vaguely worded or simply poorly laid out. It is true that the exam tests a number of things that are not on the current syllabus (like a very, very small amount of A.C. Circuit theory) but it is equally true that the exam has enormous, glaring omissions (the lack of nuclear physics can be explained by that subject being relatively new, but there's also a peculiar lack of mechanics). None of the questions require the student to apply their knowledge in any kind of unfamiliar context – a vital skill for a scientist – and there is remarkably little quantitative content.

Which brings me rather neatly to the next important question.

Question the Second: What Is Difficulty?

The thing is, I do actually think exams have got easier. I just don't necessarily think that's a bad thing. Contrary to what Melanie Phillips may believe, that isn't because I'm some wet hand-wringing liberal who believes (as she puts it in the book she wrote about an area in which she has never worked) that “all must have prizes” but rather because I'm an educational professional who knows quite how easy it is to write a hard exam question, and quite how hard it is to write an easy one.

Take, for example, the following question: “What is the boiling point of water?”

This question appeared on one of our in-house exams a year or so ago, and of course all of our students immediately answered “100 degrees centigrade” and were promptly awarded zero marks because the answer the exam-setter had been looking for was “the temperature at which liquid water becomes steam.”

Something you might have noticed, if you read through all of those infuriating standards-are-slipping articles, is that while the articles themselves talked about exams getting “easier”, the word that was used whenever the original report was quoted was “demand”. Demand and difficulty are two very different things. Vague, ambiguously worded questions, questions that are not well laid out, and questions that otherwise leave the student unclear about what they are being asked all make an exam more difficult, but they do not increase demand.

For example, the first question on the 1953 Physics paper reads:


Show that small vertical oscillations of a mass suspended by a light spring from a rigid support are simple harmonic. What condition must be fulfilled if the oscillation is to remain simple harmonic when the oscillation is no longer small? Why do the oscillations gradually decrease in amplitude as the mass continues to oscillate?
Describe an experiment in which a loaded spring is used to determine the acceleration due to gravity.
A spring is such that a load of 100gm stretches it by 20cm. When a load of 60gm is attached to the spring and set oscillating vertically there are 50 oscillations in 34.7 seconds. Calculate the acceleration due to gravity.


This question is not subdivided in any way, except by paragraph breaks. The first paragraph encompasses three questions, one of which actually contradicts the first two (the question “why do the oscillations gradually decrease in amplitude as the mass continues to oscillate?” assumes the existence of damping, but damped motion is not simple harmonic). The question could be made easier (and crucially no less demanding) by breaking it down into sections, rewording it so that it wasn't physically incorrect, and perhaps including a diagram to more clearly show the situation being described.
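
For what it's worth, here is roughly how that final calculation falls out – my own working, I should stress, offered as an illustration rather than as the 1953 mark scheme. Hooke's law on the static load fixes the spring constant, the oscillation count fixes the period, and the standard simple-harmonic-motion result connects the two:

\[ k = \frac{Mg}{x} = \frac{(0.100\,\mathrm{kg})\,g}{0.20\,\mathrm{m}} = (0.50\,\mathrm{kg\,m^{-1}})\,g \]
\[ T = 2\pi\sqrt{\frac{m}{k}}, \qquad T = \frac{34.7\,\mathrm{s}}{50} = 0.694\,\mathrm{s} \]
\[ g = \frac{4\pi^2 m}{(0.50\,\mathrm{kg\,m^{-1}})\,T^2} = \frac{4\pi^2\,(0.060\,\mathrm{kg})}{(0.50\,\mathrm{kg\,m^{-1}})\,(0.694\,\mathrm{s})^2} \approx 9.8\,\mathrm{m\,s^{-2}} \]

Note that broken into those three steps the question is no less demanding – each step still has to be carried out correctly – it is merely clearer what is actually being asked.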

This, I suspect, is around the point where a certain sort of person starts to complain about “dumbing down”. Why, after all, should we bother to write exam questions just to make things easier for people who are too stupid to read them properly? Leaving aside, for the moment, the fact that I suspect the very dry, very formal, very wall-of-text format of the 1953 A-Level almost certainly disadvantages students from working-class backgrounds, as well as students for whom English is an additional language (and I confess that whenever people complain about falling A-Level standards, I often suspect that the real complaint is that blacks and poor people have started taking them), it's also just plain lazy.

The problem here is that most people from outside the world of education have a very narrow, very simplistic view of what an exam is for. Like the commenter from the Bad Science post, people really do want to know WHAT IS THE POINT OF AN EXAM FOR THICK PEOPLE. They think that the job of an exam is to let the “best” people show how good they are, and fuck everybody else. From this narrow point of view, making exams arbitrarily “hard” is indeed desirable. The more pointless, arbitrary hoops you make people jump through, the more likely it is that only the “best” people will get through those hoops – where the “best” people are defined (circularly) as the ones who were most able to get through the pointless, arbitrary hoops in the first place.

This is just not what exams are for, and the notion that it is what they are for is elitist bullshit. “Elitist,” by the way, being one of those words (like “arrogant”) that stupid people think is secretly cool when it isn't. People think “arrogant” means “is awesome and knows it” when it actually means “is shit and doesn't know it,” while they think “elitist” means “tells it like it is and doesn't coddle people” when it actually means “is mortally fucking terrified of having to actually compete with people on a level playing field.” People want exams to be “hard” so that they can continue to feel superior to people who are less good at exams than they are. Monolingual English speakers baulk at the idea that we could rephrase exam questions to make them more accessible to people whose native language is not English, because we enjoy looking down on people who speak our language only slightly less well than they speak their own.

Exams are there to test your knowledge of a subject. They're not perfect – in fact they're a lot like Democracy, they're the worst system except for all the others. Making exams easier is completely appropriate if the original source of the difficulty was something that bore no relationship to the subject of the exam. A couple of months ago there was a big stink about the fact that in the American SATs, exam essays were not penalised for factual errors, but this again was a perfectly sensible decision. If you want to test a student's ability to write, you don't dock them marks for writing about things that aren't true – nobody was penalised in their French Oral for saying their father was a Doctor when he was really a Taxi Driver.

There is, of course, a lot of grey area here, mostly around the distinction between structured and unstructured questions. It is certainly true that working through multi-part questions without guidance is a useful analytical skill, but it is a specific skill and should be tested only by those questions that are trying to test it. There is as much difference between a deliberately unstructured question and a badly structured one as there is between requiring my students to research a topic independently and simply failing to show up for work.

Again it's interesting to bring this all back to the 1953 exam paper, which is supposed to be our gold standard for a “good” “hard” A-Level exam. Again, I observe that the vast majority of what it tests is simple recall (which most educators and halfway sensible people agree is by far the least demanding mode of assessment). It would be an unmitigated disaster for education if we were to allow the “harder is better” doctrine to be taken even remotely seriously, because all we would wind up doing is turning exams into the equivalent of a badly written text adventure, where success depends entirely on your ability to read the mind of the person who wrote the damned thing.

Question the Third: Does it Matter if Exams get Easier?

The title of this article promises, amongst other things, “A Short Lesson in Controlling Public Discourse.” The “exams are getting easier” doctrine is a fine example of a particularly pernicious technique for controlling the public debate about a particular topic. The technique works as follows:


  1. Decide you want things to be a particular way

  2. Declare that things are now less the way you like than they were in the past

  3. Get angry about it



Immediately, the debate becomes about one thing and one thing only: whether your observation that things are getting less like you want them to be is correct. Meanwhile everybody – even people who really should know better – concedes your unstated assumption that it is desirable for things to be the way you want them.

The job of exams is not to be hard, it is not to be easy, it is most certainly not to provide the top five percent of students with a flashy qualification they can use to get into Oxford. The job of exams is to test learning and produce adequate differentiation across the full range of candidates. This, amongst other things, is why we need what that charming individual called “Exams for Thick People”. The job of an exam is not to let clever people show off, it is to actually assess people, and that means differentiating between D and E grade candidates just as much as it means differentiating between A and B grade candidates. Complaining that exams are getting easier is just a socially acceptable way of complaining that we're no longer restricting education to a privileged elite.

That isn't to say that there aren't problems with the current system. The current A-level system does not, by itself, provide adequate differentiation between the very best students and the merely very good. But this problem cuts far deeper than the difference between “easy” and “hard” exams, and this problem has always existed. At the highest levels exams simply become a bad method of distinguishing between candidates. There is, to put it simply, a reason that Oxford interviews people.

It is, as I said at the start, almost certainly true that exams are getting easier in some ways, but this is often for quite unexpected reasons. Perhaps most interestingly, the recent Ofqual report on the viability of the A-level suggests that the recent proliferation of multiple-choice questions and short structured answers over longer essay-style questions has more to do with making exams easier to administer than anything else. There is simply a shortage of skilled examiners (students, and adults who work outside of education, tend to forget that somebody actually has to mark exams), which means that it is impractical to mark large volumes of complex written questions.

Perhaps what I find most infuriating about the “exams are getting easier” mantra, apart from the fact that it masterfully deploys a rather manipulative rhetorical strategy, is that it is based almost entirely on the very obsession with letter-grades that it condemns. Nobody bothers to look at actual questions, or actual syllabus content (at least not until Ofqual comes along and does it for them); nobody pays attention to what students are actually learning. People just look at a 1.7% increase in the number of A-grades and insist that it is incontrovertible evidence that “standards” are slipping. Just A-grades, of course, because people really do not seem to believe that grades below A exist at all, or if they do they certainly don't seem to think that the people who get them are at all important (we're back to “exams for thick people” again). Is it any wonder that the number of A-grades grows year-on-year when in the eyes of half the population any grade below an A is a fail?

Would some people who got Cs at A-level in 1953 be better suited for university than people getting As at A-level now? Almost certainly. But I'll let you in on a secret. There are some people getting Cs at A-level right now who are better suited for university than other people getting As at A-level on the same exam. Exams, all exams, GCSEs, A-levels, Cambridge Pre-U, the IB, all of them, are imperfect tools. Worse, they are imperfect tools with no specific purpose – existing partly to help universities select candidates, partly as a qualification for eighteen-year-old school-leavers, partly as learning for its own sake. They need to impart subject knowledge, independent study skills, and maturity of outlook. They need to prepare students for university or for work, in England or abroad.

And different aspects of the function of A-levels are important to different people – my brother (a mathematician) insists that maths teaching in the UK is flawed because it focuses too much on making students rote learn specific techniques, instead of teaching them an understanding of the fundamental properties of number. The average UK journalist thinks that maths teaching in the UK is flawed because school leavers are bad at mental arithmetic and haven't rote learned enough specific techniques (the strangest article I've seen about maths teaching is this one in the Telegraph which somehow manages to cite the fact that thirty percent of parents said they were unable to do their kids' maths homework as evidence that maths teaching was getting worse).

There is no way on Earth that a single letter-grade can summarise the extent to which a student has internalised all of the things that they are required to learn over the course of their A-Levels or, for that matter, their entire school career. Making exams “harder” would not make that single letter-grade convey more information, or make it correlate better with all the dozens of different things it is supposed to correlate with. It certainly won't tell us who the “best” people are and it is absurd to suggest that it should.

It's finals season in Oxford at the moment, and throughout the city students are sitting their exams, and a good number of them are going to get better results than I did. And of course I would find it comforting to pretend that because I did my A-levels fifteen years earlier than them, I am somehow smarter or better educated than they are, but it simply does not work that way, no matter how much I might want it to.

The simple fact is that systems of education are not commensurable. Judging the value of a person's education by how closely it matches your own is the very definition of small-mindedness and unbecoming of a person who considers themselves educated.
Themes: Topical
~
Comments
because I'm some wet hand-wringing liberal who believes (as she puts it the book she wrote about an area in which she has never worked) that “all must have prizes”

I'm not completely sure how "all must have prizes" is supposed to be a liberal idea. If you don't think rich white men are better than everyone else, it must be because you don't have any standards, or any idea of how to judge real quality? Any system that doesn't automatically award prizes to rich white men and fuck everyone else must be inherently flawed? Am I being too harsh here?
Guy at 08:36 on 2012-05-07
One of the best comments I've heard about exams was from a lecturer who was talking about how pressure is applied to markers to maintain a certain grade average because unless a certain number of people get low marks (or fail) then you're running a "Mickey Mouse" course; handing out low marks is a measure of your "seriousness". He said, the most serious course in the world is the St John's First Aid course, because knowing how to do that stuff really is a matter of life and death, and they have two grades: pass and fail. And if you fail, there's no recriminations or shame or suggestions that you're thick; you just do the course again. Because either you know the stuff to a good-enough standard or you don't.

There is a part of me that worries about rampant grade inflation - in some parts of the world more than others - because it has a whole bunch of effects on the way teaching is done and the way students work. I think the worst thing, though, is the way that exams, by performing various social "gatekeeping" functions, end up assuming an importance which turns the purpose of education on its head; instead of the exam being there to test and motivate the work of teaching and learning, teaching and learning are redefined as modes of development for people's ability to pass exams. But I suspect that's a whole separate barrel of issues...
Dan H at 11:55 on 2012-05-07
@fishinginthemud

I'm not completely sure how "all must have prizes" is supposed to be a liberal idea.


Because it relies on the scary, dangerous idea that poor people and immigrants have as much right to be served by the exam system as rich white people.

A lot of the changes to A-levels have made them a better test of subject knowledge and a worse test of things like "whether you're used to formal-register prestige-dialect English" and "whether you went to the sort of school that did good exam prep." A lot of people think these changes are bad because a lot of people genuinely, ideologically believe that it is desirable for us to label some people as objectively "worse" than others, because "that's how the world works" and because it encourages "competition".

@Guy

One of the best comments I've heard about exams was from a lecturer who was talking about how pressure is applied to markers to maintain a certain grade average because unless a certain number of people get low marks (or fail) then you're running a "Mickey Mouse" course; handing out low marks is a measure of your "seriousness".


Perhaps it's just because I've been reading Chavs: The Demonization of the Working Class, but I do think that a lot of this comes back to the British obsession with our allegedly-non-existent class system. Part of the reason A-levels don't do their job particularly well is that most people think that the job of A-levels is to tell us who the "best" people are.

There is a part of me that worries about rampant grade inflation - in some parts of the world more than others - because it has a whole bunch of effects on the way teaching is done and the way students work. I think the worst thing, though, is the way that exams, by performing various social "gatekeeping" functions, end up assuming an importance which turns the purpose of education on its head


Oh very much so. We have particular problems with this in our school because we're a for-profit organization, and what our marketing department wants is students who get lots of As at A-level, and this attitude filters down to our students, so not only do they think anything below an A is a fail, but they do more subjects than they have to because they falsely believe that getting a larger number of As makes them more likely to get into university.

It's the same ignorant attitude that leads to an outcry every time somebody with four or five As at A-level is rejected from Oxford. People think that A-levels are supposed to be university tickets, and they aren't.
Andy G at 14:05 on 2012-05-07
Dan - thanks, this is a brilliant article.

I had two questions that may also (as Guy put it) be separate barrels of issues, but are hopefully relevant:

Firstly, there's sometimes an assumption that exams are meant to be a final, once-and-for-all assessment of how good a candidate is; a contrasting way of looking at it would be the driving lessons/St John's Ambulance model, where the point of the exam isn't to categorise you once and for all as a Good Driver/First Aider or a Bad Driver/First Aider, but to assess whether you've reached the criteria yet or need to go away and try again later. Lots of the furore about people being allowed to re-take A-level modules (also supposedly a case of "dumbing down") is rooted in the assumption that exams should fit the former model rather than the latter. But presumably, if all exams were taken repeatedly until you passed them, that could be demoralising. What do you think is the appropriate form of assessment?

Secondly, another part of the "dumbing-down" case is that students are spoonfed narrowly circumscribed chunks of information purely to help them jump through the hoops of the exam assessment criteria, rather than being given a holistic education like in the good old days when teaching was about teaching and not about exams. What are your thoughts on that?



http://openid.aliz.es/sam at 14:19 on 2012-05-07
This is an excellent article. It's a bit strange seeing this reactionary flare-up in May; usually it's scheduled for August when exam results come back!

One criticism I have is that there's no discussion about the different purposes of exams. I would identify three:
* measuring collective student ability, which is mostly useful as a metric to evaluate other things indirectly;
* measuring individual student ability against fixed benchmarks;
* measuring individual student ability against their peers.

Different (and, to an extent, conflicting) properties of exam schemes make them suitable for each of these purposes. Fixed grades, consistent year-on-year marking and stable content suit the first two. Scaled marking, a high degree of gradation in marks available and lots of marks to even out random variation are good for the third. A-levels are sort of shoehorned into all three roles: A*-C pass rate is a standard yardstick for comparing colleges; it's desirable to know that if a student gets a good grade in their, say, physics A-level they know a few core things about physics; and, of course, marks are the primary way for universities to learn applicant ability.
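
To make the fixed-benchmark versus scaled-marking distinction concrete, here is a toy sketch in Python – the grade boundaries and quotas are entirely invented, and nothing here reflects any real board's procedure:

    # Toy illustration: criterion-referenced vs norm-referenced grading.
    # All boundaries and quotas below are invented for the example.

    def criterion_referenced(mark):
        """Fixed benchmarks: the grade depends only on the candidate's own mark."""
        for grade, cutoff in [("A", 80), ("B", 70), ("C", 60), ("D", 50), ("E", 40)]:
            if mark >= cutoff:
                return grade
        return "U"

    def norm_referenced(marks):
        """Scaled marking: the grade depends on where a mark sits in the cohort."""
        ranked = sorted(marks, reverse=True)
        quotas = [("A", 0.10), ("B", 0.25), ("C", 0.50), ("D", 0.75), ("E", 0.90)]
        def grade(mark):
            share_above = ranked.index(mark) / len(ranked)  # fraction of cohort placed above
            return next((g for g, q in quotas if share_above < q), "U")
        return [(m, grade(m)) for m in marks]

    cohort = [85, 72, 72, 64, 58, 41, 35]
    print([criterion_referenced(m) for m in cohort])  # same marks always earn the same grades
    print(norm_referenced(cohort))  # the same mark can grade differently in a stronger year

Under the first scheme a rise in A grades can only mean more candidates clearing the bar; under the second the proportion of As is fixed by quota and cannot inflate at all, which is exactly why the two suit different purposes and why shoehorning one qualification into all three roles causes trouble.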

Particularly problematic is the first purpose for exams. Since any metric can be gamed, I would in fact be astonished to learn that there *wasn't* inflation in results going on at some level or another (whether it's pass rates or A*-C grades or A grades or whatever). Governments and schools quite like being able to wave 'best ever' exam results around to prove that they are Successful and that Progress is taking place.
Dan H at 14:54 on 2012-05-07
@Andy

Resits are a tricky one. On the one hand it's reasonable to argue that somebody who gets an A at A-level obviously has an A-grade level of knowledge, whether they get that grade on their first attempt or their fifth; on the other hand resits do have other negative consequences (which tie in strongly with the second issue). More resits means more exams, which means more exam focus and more focus on grades, grades, grades.

So on the one hand I think a lot of the stigma attached to resits is the consequence of snobbery (a lot of people complain that resits are bad because "that isn't how it works in the real world" which is, well, kind of horseshit - most qualifications that aren't designed to be set at a particular time in your life can be taken as often as you want). On the other hand resit culture is a side effect of grades culture which is itself a side effect of people obsessing about "standards" when "standards" are defined solely in terms of who gets an A-grade.

A big problem a lot of teachers identify with the current modules-and-resits system is that students tend to forget things the moment they've finished the exam (which means, ironically, that the students who resit more actually wind up understanding things *better* rather than less well, because they've had to keep reinforcing their understanding of earlier topics). So I think it's true that a lot of the criticisms of resits are invalid (because passing the test means you can do the things the test tests, whether you passed on your first attempt or your fifth) but that doesn't mean there aren't also valid criticisms.

I'm also not sure it's appropriate to compare A-levels to the St John's Ambulance or a driving test, because those tests are trying to teach specific skills with a specific purpose, and in that case all that matters is whether you can do it or not. A-levels are very different, in that their aim is to teach a more general set of skills, and it is less clear that "being able to pass the exam" is necessarily the same as "having the abilities for which the exam is supposed to test." Although as I say, the problem here isn't really "resits" per se.

As for the good old days when teaching was about teaching, obviously I've not been through that educational system myself, but I'm always highly suspicious of arguments that hark back to "the good old days". Ultimately, teaching to the exam will always produce better results than teaching a broad understanding of the subject, because it's always easier to learn a word-perfect answer to a set question than to learn to synthesize your own answers to a range of questions. I certainly don't believe that the structure of exams was less amenable to exam-focused teaching in the past (and looking at the 1953 A-level paper linked above, all of the questions are ones you would have to be prepped for in advance).

Teaching to exams is very much a symptom of a grades-focused culture, and it's certainly possible that the culture in education has become more grades-focused, but that has been caused in part by a (very desirable) drive towards greater transparency (although it has also been driven by a rather less desirable exposure of schools to market forces).

The problem (outlined in the Ofqual report I link above, which is a good read if you're interested) is that it is pretty much impossible to have an exams system which both meets necessary criteria for transparency, validity and reliability, and is also resistant to teaching to the exam. If you know what the exam is looking for, you can teach people those things; if you don't know what the exam is looking for, the exam is unfair because you may teach students things for which they get no credit.

Part of the problem, of course, is that "holistic learning" is extraordinarily hard to define. I remember having a not-exactly-argument with my mother a few years ago after watching a production of The History Boys because I told her that I thought that Mr Irwin was a massively better teacher than Mr Hector. I think a big part of the "good old days" attitude to teaching is that it thinks teachers should be like Mr Hector, sort of vaguely exposing students to ideas in the hopes of inspiring them with some magical concept called learning, whereas I far prefer the approach taken by Mr Irwin - sitting students down and teaching them specific things and judging your success by whether or not they can do those things, not by how good you feel about yourself.

To put it another way, the danger of a good-old-days "holistic" attitude to teaching is that it fails basic SMART analysis. It's all very well to say that teaching shouldn't be about exams, but without exams we have no way of knowing if the students are learning anything at all.

To put it another way, with a well designed exam system teaching to the exam should be *the same thing as* teaching for its own sake. A-levels (at least with my board) have got (broadly speaking) a lot better at this in the last ten years. It is very common now for A-level physics papers to contain questions which specifically require the student to apply their existing understanding of the subject to new areas. Now obviously I still try to train my students to answer these questions, but if the questions are designed properly, the ability to answer those questions should be the *same* as the ability to apply existing understanding to new situations.

So yeah it's, umm, complicated. Which is why "dumbing down" arguments annoy me so much.
Dan H at 15:06 on 2012-05-07

One criticism I have is that there's no discussion about the different purposes of exams. I would identify three:
* measuring collective student ability, which is mostly useful as a metric to evaluate other things indirectly;
* measuring individual student ability against fixed benchmarks;
* measuring individual student ability against their peers.


I think it's even more complicated than that, actually, in that the three examples you give are just three different applications of one function of exams ("measuring student ability") which is itself quite a hard thing to define.

Ofqual identifies several different functions for A-levels, including but not limited to:

* As a school-leaving certificate for 18 year olds who do not wish to progress to university
* As a means for universities to select students
* As a means for employers to assess potential employees
* As a means to prepare students for university study
* As a means to prepare students for life outside school in a general way
* As a means of delivering specific and detailed subject knowledge
* As a means of introducing students to a variety of ideas and concepts that may help them make future career decisions

And so on.

Of course as far as journalists are concerned, A-levels have one purpose which is:

* To maintain "standards".

Whatever that means. Sometimes I wonder why we don't just let fleet street run the country, after all, they have a much better idea of how things should be done than we mere professionals.
http://scipiosmith.livejournal.com/ at 16:17 on 2012-05-07
People want exams to be “hard” so that they can continue to feel superior to people who are less good at exams than they are.


I think it's half this and half the continued industrialisation of education, where all that matters is how you use your grades/degree to 'benefit the economy' and improve your bank balance. Not that there's anything wrong with aspiring to a 100k salary, but I once worked with a guy who told me in all sincerity that he thought student loans should only be available to people reading "important" subjects at "first rate universities" and everybody else should have to fork out the cost themselves up front; which I think neatly encapsulates both attitudes.
Because it relies on the scary, dangerous idea that poor people and immigrants have as much right to be served by the exam system as rich white people.

I think the problem also comes from our unconscious assumptions about who should do well on tests and who shouldn't. When boys do better on the math section of the SAT than girls, that's evidence that boys are better at math than girls. When girls were doing better on the verbal section than boys, that was evidence that there was something wrong with the verbal section, and it had to be changed.

As you say, a lot of people would like to refer to standardized testing as evidence that they and people like them are superior to those who didn't score as high as they did. The SAT is at least partly calibrated to be an "intelligence" test, and therefore an indicator of self-worth and a method of ranking people, much more than a measure of how much a student has learned.

And in the case of the SAT, as well as graduate school exams like the GRE, GMAT, and LSAT, if you want to stop people from "teaching to the test" and not give students who can afford expensive prep classes an advantage over those who can't, you have to make the tests as "unteachable" as possible. You have to make them more like traditional IQ tests and less like tests of educational content. The problem is that no one is really sure what IQ tests themselves are measuring.
Dan H at 19:43 on 2012-05-07
You have to make them more like traditional IQ tests and less like tests of educational content.


That's the thing, though, IQ tests are totally teachable. You do a lot of IQ tests, you get better at doing IQ tests.
And you also mentioned that "teaching to the test" shouldn't be a bad thing; it should just be like normal teaching. The test should be calibrated to measure how much you learned of what you should have learned, and how well you can apply that knowledge in situations you haven't seen before. It's not all that hard.
Now that I think about it, the majority of the GMAT and GRE students in my classes are people who have forgotten most of the math they learned in school ten or twenty years ago, and can get their scores up a hundred points or so just by reviewing basic algebra and geometry.

The market I need to tap into is students who got pretty good scores but want awesomely high scores. Because the GMAT, for example, wants to distinguish as much as possible between the very good and the very best, the difficulty (or demand) gap between, say, a 700 and a 750 is much wider than the gap between a 600 and a 650. A lot of students imagine there are tricks and secrets to getting super-high scores that a prep class is supposed to give them, and don't realize that the only way they're going to get those scores is to learn the material honestly. I have a feeling that the more you can get them to think they're gaming the system while actually teaching them honestly, the more you'll reach them.
Dan H at 20:48 on 2012-05-07
I have a feeling that the more you can get them to think they're gaming the system while actually teaching them honestly, the more you'll reach them.


I'm reminded of an old sitcom episode from years ago in which the main character's brother was attempting to cheat on a test by smuggling in notes, and who in the end observes:

"All those times I copied the book onto my arms, I realised that I didn't need to look at the book any more, and I realised I'd discovered the perfect way to cheat. I hid the information in my brain."
http://derigueurly.wordpress.com/ at 00:25 on 2012-05-08
"but Journalists[sic] are as disingenuous as ever [...] The average UK journalist [...] as far as journalists are concerned [...] I wonder why we don't just let fleet street[sic] run the country, they have a much better idea of how things should be done than we mere professionals."

Did you really just smear a whole professional class while complaining about the smearing of a whole professional class? Classy.

Mind you any fool can write an article. It's not like journalism requires any specialised skills or training...

I'm an 'average journalist', by the way, which means I write for trade publications. Like the vast majority of my peers I've never said anything in print about exams, and I admire the teaching profession.

"we mere professionals [...] for-profit organization [...] marketing department"

Hm, interesting. Would you like to tell the class what professional teaching qualifications you have? That didn't come up in the article.
Modern "mainstream journalists" are irresponsible because the type of reader who fails to make use of modern technology to view sources directly is an ideal person to serve adverts to. Some would go so far as to call them useful idiots. No arguments there.

A-Levels that don't require you to understand badly worded questions? I agree that these will test subject knowledge better, but they're a worse test of your ability to apply this subject knowledge to the real world (whose questions are always badly formed) and of how well you will perform at university (where your lectures and exams are as badly formed as the A-Level questions). The 1953 paper is closer to the type of question I have to answer as an engineer in the real world today (and had to answer as an undergraduate in 2003) than the A-Level papers I sat in 2001 and 2002.

I'm of the view that recall is less important now than it was in the past - why waste time memorising things when there's Google to index your brain (and that of others) for you?

Knowledge of pure science is also wonderful, but in itself virtually useless unless you are able to apply it. Congratulations on your A - come back when you're able to tie your shoelaces without me structuring the task for you please. The ability to apply more basic knowledge, or working out what the deliberately badly structured question is actually asking - more important than ever.

Older exams (set at one point in time) were better at this than modern exams (spread over time, plus coursework) are. For the purpose of filtering university entry anyhow, which is what these papers are designed to do. They are not intended to demonstrate subject knowledge. As such - yes, they're easier now than in 1953 and as this makes them less useful for filtering university entry this makes it a bad thing.
Shimmin at 07:26 on 2012-05-08
I'm an 'average journalist', by the way, which means I write for trade publications. Like the vast majority of my peers I've never said anything in print about exams, and I admire the teaching profession.

To be fair, when people say 'journalist' they almost never mean writers for trade publications; they mean writers for newspapers, and the rest of the article is clearly talking about that group of people, not about you.

Did you really just smear a whole professional class while complaining about the smearing of a whole professional class? Classy.

Um... actually, he didn't complain about smearing of teachers. "Fleet Street act like they know better than teachers what exams are for and how they should work" is not the same as "Fleet Street are smearing the profession of teaching."
Dan H at 09:33 on 2012-05-08
Did you really just smear a whole professional class while complaining about the smearing of a whole professional class? Classy.

Sorry, you're right, I was making cheap generalizations. I'm afraid people (journalists or otherwise) bashing the education system makes me very angry, and I allowed that anger to get in the way a little. You're right of course that I shouldn't be bashing "journalists" as a whole when I'm really only referring to a relatively narrow subsection of the profession. Like most laypeople, I tend to forget that "journalist" doesn't just mean "people who write for national newspapers".

And ultimately what I'm really guilty of here is using "journalist" as a shorthand for "that particular vein in public opinion which is reflected by a particular variety of right-wing tabloid journalist, and supported uncritically by much of the rest of the mainstream news media." Which was, again, wrong of me and I genuinely apologise.

Hm, interesting. Would you like to tell the class what professional teaching qualifications you have? That didn't come up in the article.


I'm more than happy to be upfront about this (if I wasn't I wouldn't have included phrases like "for profit" and "marketing department" in the original article).

I don't have a PGCE, professional teaching qualifications in the UK are mostly geared towards teaching in the state sector, and they aren't required to work in the private sector (I work for an international school). That said, I *do* have an actual degree in Physics, which roughly 75% of Physics teachers in state schools do not. Which of us is better qualified is genuinely a question for debate.

I've been teaching for around six years, and have learned most of what I know on the job. I've worked closely on syllabus development for our in-house exams, so it's an area I understand well.

Once again, I'm genuinely sorry that I attacked your profession. It was, well, unprofessional of me.
Dan H at 09:42 on 2012-05-08
The ability to apply more basic knowledge, or working out what the deliberately badly structured question is actually asking - more important than ever.


I think you're giving the 1953 paper far more credit than it deserves. As I think I point out above, there is a far, far bigger difference than you allow between a *deliberately unstructured* question and a question that is merely *badly worded*.

Arguing that exams should be deliberately badly worded because real life is badly worded is absurd (much as I hate to quote XKCD, communicating badly and then laughing at people who misunderstand you does not make you clever). Exams should be deliberately designed to *test your ability to apply your knowledge to unfamiliar situations* (which the 1953 exam most certainly does *not* do - every example it asks for is very, very standard) but assuming that you can achieve the same effect merely by writing a shoddy paper is extremely naive.

Older exams (set at one point in time) were better at this than modern exams
(spread over time, plus coursework) are. For the purpose of filtering university
entry anyhow, which is what these papers are designed to do.


It's funny that you mention coursework as one of the things that has made exams worse, because again, a common criticism from *actual universities* (and Ofqual) is that the loss of coursework options has made exams prepare students less well for university.

I think the mistake you're making is in assuming that an exam which *superficially resembles* a university exam must necessarily be a better check of a student's readiness for university than an exam which does *not* resemble a university exam, but which actually tests for the skills a student is supposed to have.
http://derigueurly.wordpress.com/ at 10:31 on 2012-05-08
Daniel - Apology accepted, no harm done except to my blood pressure. You might want to edit the article, especially the strapline, which is currently still a cheap insult to a whole profession.

I don't think you should bash Phillips for her lack of credentials without being clear about your own up front. Whatever your intentions, it weakens your argument by making you look disingenuous. But call that a suggestion, not a complaint.

"Some would go so far as to call them useful idiots"

If you want to call 'us' idiots, come out and say it, don't hide behind a form of words. If you do, though, I'll call you an idiot, and not a useful one. How would you 'make use of modern technology to view sources directly' when the source is an interview? How many people in the world have the background to understand any industry memo or any research paper? Do you think editorial and comment pieces are completely without value?

Shimmin - apparently Daniel disagrees with you, but this

"To be fair, when people say 'journalist' they almost never mean writers for trade publications, they mean writers for newspapers, and the rest of the article is clearly talking about that group of people, not about you."

makes my blood boil all over again. 'Group of people'? 'Newspapers'? I'm guessing you mean national newspapers. Like those close cousins, the Sun and the Guardian and the Racing Post? And the BBC website that Dan links to in the first paragraph. Is that a newspaper too? What about TES columnists? Are they always talking ignorant rubbish about education too?

Ben Goldacre, by the way, writes for newspapers. He makes a living in popular science journalism partly by bashing popular science journalism, and he's very entertaining. But don't mistake it for academic inquiry.
Shimmin at 10:54 on 2012-05-08
The ability to apply more basic knowledge, or working out what the deliberately badly structured question is actually asking - more important than ever.

In "deliberately badly structured question", the crucial bit is still "bad". It's introducing a layer of obfuscation, because now the question is testing not only your knowledge of (say) physics, but your ability to decode the examiner's intent without the opportunity to ask further questions. This basically boils down to mind-reading, and favours pupils who are specifically prepared for such questions, and who share the examiner's background and native language and dialect. In other words, these reinforce the kind of elitism Dan mentioned earlier.

If it's necessary to test people's ability to interpret (or indeed recognise) badly-structured assignments or questions, then test for that as well, but conflating that ability with the unrelated ability to understand and apply physics - and then using it as a measure only of the latter - is not constructive. Interpretation is a useful skill, but generally speaking if people give you poorly-worded instructions or requests then you ask for clarification. Exams don't really work as a test for your performance in real life, because real life doesn't happen under exam conditions.

I think the mistake you're making is in assuming that an exam which *superficially resembles* a university exam must necessarily be a better check of a student's readiness for university than an exam which does *not* resemble a university exam, but which actually tests for the skills a student is supposed to have.

I'd add to this that only a relatively small proportion of university consists of exams, and this is pretty variable by subject, which reinforces this point. Coursework is a massive chunk of many subjects (dissertation, anyone?) and research, discussion and presentation are important elements that can't be tested by exams.
Dan H at 11:11 on 2012-05-08
Daniel - Apology accepted, no harm done except to my blood pressure. You might
want to edit the article, especially the strapline, which is currently still a
cheap insult to a whole profession.


I've left the text as it is, but added an apology to the top (I don't like editing things after they're published because it feels like I'm trying to deny my mistakes - I did something similar when I called JK Rowling a smug bitch). Again, I'm genuinely sorry.

I don't think you should bash Phillips for her lack of credentials without being
clear about your own up front. Whatever your intentions, it weakens your
argument by making you look disingenuous. But call that a suggestion, not a
complaint.


It isn't her lack of professional qualifications that bugs me, it's the fact that she has no connection with education whatsoever. She's a career journalist. Although I don't talk about my qualifications in the article I do, I think, make it fairly clear that I actually am a teacher.

Obviously I don't think that opinion pieces and editorials have no value, but when an opinion piece seems to me to be written from ignorance, I don't think it's unreasonable to highlight that. You, quite rightly, highlighted the fact that I had ignorantly mischaracterised your profession, for example.
Andy G at 11:13 on 2012-05-08
"How would you 'make use of modern technology to view sources directly' when the source is an interview?"

I think the demand for sources/references to be linked was meant more with regard to research pieces rather than interviews. Though actually it is increasingly common for audio/video of interviews to be shared in articles, and I think that's a welcome development where it's possible.

"How many people in the world have the background to understand any industry memo or any research paper?"

Hundreds of thousands, if not millions. I don't think journalists (or academics/industrial experts for that matter) should be gatekeepers of knowledge who get to decide what other people will or won't understand. Since there are no restrictions of space online, it should be good journalistic practice to link directly to research papers - it's up to the reader to decide whether to click-through.

"Do you think editorial and comment pieces are completely without value?"

I'm not sure who's supposed to think that, but it would be rather inconsistent given that this very article is a comment piece.
Shimmin at 11:49 on 2012-05-08
@Deriguerly:

I think you may have misunderstood my intent. I wasn’t attempting to defend Dan. I just slightly disagreed with your interpretation: it didn't seem to me that he was complaining about teachers being smeared, but rather about 'journalists' claiming expertise in educational theory. It’s not an important point, though.

makes my blood boil all over again. 'Group of people'? 'Newspapers'? I'm guessing you mean national newspapers. Like those close cousins, the Sun and the Guardian and the Racing Post? And the BBC website that Dan links to in the first paragraph. Is that a newspaper too?

I'm sorry you're upset, but yeah, that's about right, I do in fact see them as a vague group. Ordinary people's definition of 'journalist' is not the same as the professional's in the field. I can see how that would annoy you, it's a shame, but I don't think there's much to be done about it. Nurses, accountants and engineers (amongst others) have the same problem, but unfortunately your fairly broad profession is being defined by a narrow and sometimes unpopular group because they're the most prominent.

I'm having to work this out on the fly here, but if I hear people talking about 'journalists', I generally assume they're talking about newspapers and magazines (most likely but not necessarily national ones), maybe TV and radio depending on the context, and about generalists rather than experts producing specialist pieces. I don't immediately think of writers at New Scientist or The Irish Beekeeper. Like any definition it's very much down to context, but I don't assume they're talking about the broader field without indications that way, which I don't pick up from this article. At the same time, since the article discusses journalists who talk about exams, that immediately excludes everyone who doesn't.

With respect, I'm not really up for an argument about the correct definition of 'journalist', so this is my last word on the subject.
Arthur B at 11:59 on 2012-05-08
Do you think editorial and comment pieces are completely without value?

You've made a mild jump here. Journalists who digest and regurgitate, in less technical language, the conclusions of a report and the various reactions to it perform an undeniably useful function, because who seriously has the time to read every single primary source on every news story they are interested in? Editorials and comment pieces aren't about that, though; they're about presenting and promoting a particular viewpoint or opinion.

I would say the value of editorials or comment pieces hinges directly on whether they present the opinion of someone credible who has actually considered, studied and worked with the issue at hand in depth (or who is at least generally conversant with the field in question), or whether they are drive-by knee-jerk reactions more concerned with entertaining fans of the columnist's style than with approaching the subject in depth.

I'm sure we can all cite examples of both. I imagine in trade journals (or The Racing Post, which I'd personally characterise as a magazine which happens to be printed on newsprint as opposed to a newspaper because in my mind "newspaper" means "generalist news journal") the balance tends to lean towards the former.

As far as mainstream newspapers go, well, Richard Littlejohn and Mel Phillips are two of the louder and more enervating examples of the drive-by opinionating school of commentary, but pretty much all the major papers have their own in-house blatherers who prompt eye-rolling from me whenever I see their name at the top of an article. And of course there are numerous people who straddle both sides of the line; when it comes to Charlie Brooker's columns, for instance, I find the degree to which they are insightful and useful to be directly proportional to the extent to which they relate to television or video games, because those are subjects the man's given a lot of thought to and has spent a fair amount of professional time becoming conversant with, and he's reasonably good at analysing them. When it comes to, say, politics or the Royal Family or whatever, his opinion is no more interesting than that of any random articulate person holding forth at the pub.

Now, when it comes to the "A-levels are dumbing down/education is going down the pan/the current generation have it too easy" dirge we regularly get, the people I usually see driving it aren't the writers of the TES, who are usually marginally more nuanced about this stuff. It's the Littlejohns of this world.
Dan H at 12:26 on 2012-05-08
Now, when it comes to the "A-levels are dumbing down/education is going down the pan/the current generation have it too easy" dirge we regularly get, the people I usually see driving it aren't the writers of the TES, who are usually marginally more nuanced about this stuff. It's the Littlejohns of this world.


More or less this. Although in this case Deriguerly is quite right that it is inappropriate to treat the entire profession as if it consisted of Richard Littlejohns.

This basically boils down to mind-reading, and favours pupils who are specifically prepared for such questions, and who share the examiner's background and native language and dialect. In other words, these reinforce the kind of elitism Dan mentioned earlier.

It's the kind of elitism that decides that the people most like you are the ones most qualified to do your job. Of course if an overall reduction in cronyism would have hindered someone's educational and professional prospects, I can see why it might make them uncomfortable.
Guy at 23:40 on 2012-05-08
Isn't Charlie Brooker the guy who broke the story on David Cameron being a lizard? He must have at least some political (or perhaps zoological) expertise to have uncovered the facts there...
Arthur B at 01:14 on 2012-05-09
No, that'll be David Icke.

Then again, David Icke really ought to have a column in one of our tabloids. If they're going to regale us with the unsupported assertions of cranks, may as well go for the crankiest of them all...
Alasdair Czyrnyj at 02:12 on 2012-05-09
@Guy: No, Charlie Brooker is the secret Nazi.
Janne Kirjasniemi at 21:10 on 2012-05-09
An interesting piece. I get the impression that this is part of a bigger discussion on education going on in the UK, since it is such a popular subject for the mainstream media to handle poorly. On the other hand, if the exams are getting better as tools for testing students' knowledge and are becoming more understandable, then the people responsible for these tests are getting more professional.

While it is clear that different parts like Wales and so on will have their own system, it is somewhat confusing to hear that there is such a great number of these tests available, or did I get this wrong? And that they have fixed grades for certain numbers of points, which are set beforehand? The equivalent test here in Finland, the Matriculation Examination, is graded according to the Gaussian curve: the bottom 5% fail, the top 5% get a Laudatur, and the rest of the grades fall in between. Of course there are certain threshold point counts to avoid the situation where someone would fail or get a bad grade simply because of a statistical fluke, but this rarely comes into play, since there are tens of thousands of students participating in the tests. But if one wants to give out grades at all, it seems to work out fine, since different tests can't really be compared with each other.
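
To make the idea concrete, here is a minimal sketch (in Python) of grading on the curve as described: grades are fixed shares of the cohort rather than fixed score thresholds. The two 5% tails are the real ones mentioned above; the single "pass" band in between is a made-up placeholder for the actual intermediate grades.

    import bisect
    from collections import Counter

    # Cumulative cut-offs: bottom 5% fail, top 5% get a Laudatur (those two
    # figures are from the comment above); the single "pass" band in between
    # stands in for the real intermediate grades, which aren't reproduced here.
    BANDS = [(0.05, "Improbatur (fail)"), (0.95, "pass"), (1.00, "Laudatur")]

    def grade_cohort(scores):
        """Assign each score a grade by its rank within the cohort."""
        ranked = sorted(scores)
        n = len(ranked)
        grades = []
        for s in scores:
            frac = bisect.bisect_right(ranked, s) / n  # share scoring <= s
            grades.append(next(label for cut, label in BANDS if frac <= cut))
        return grades

    # Toy cohort of 100 candidates with marks 0..99: the five lowest fail and
    # the five highest get a Laudatur, whatever the raw marks happen to be.
    print(Counter(grade_cohort(range(100))))
    # Counter({'pass': 90, 'Improbatur (fail)': 5, 'Laudatur': 5})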

That the questions should be hard to understand because of real life situations I just can't understand. Surely students should be taught knowledge of a subject and tested accordingly. Understanding unclear communication is a different subject altogether, and while it may be common in the real world, surely the point is to teach people to be clear, not to encourage such bad communication in schools and universities. And surely a student taking their A-levels or graduating as an engineer is at the start of their education or career; figuring people out is a necessary part of learning through working and gaining experience, which simply cannot be taught.

On the mathematics thing, there used to be an educational movement called the New Mathematics in the sixties, which originated in the US, which sought to teach kids group theory and to calculate with different bases and whatnot. The problem wasn't that the kids couldn't learn; it was that the teachers were not trained to teach that way and the parents could not make any sense of the homework at all, which led to fear and so on. One could imagine it working if introduced gradually, but the pressure on teachers in the lowest grades would be great, since such a thing would have to be started early, and at least in Finland the lowest grades are taught by teachers who cover all subjects for their class and are thus not required to have a master's degree in mathematics, for example.
Dan H at 21:35 on 2012-05-09
While it is clear that different parts like Wales and so on will have their own system, it is somewhat confusing to hear that there is such a great number of these tests available, or did I get this wrong?


Exams in the UK are set by exam boards, of which there are a fair few; schools pick the board that best suits them. So not everybody in the country sits the same exam.

And that they have fixed grades for certain numbers of points, which are set beforehand?


It's a little from column A, a little from column B. The grade boundaries are set after the exam is sat, with the exam boards doing some kind of statistical black magic to decide where the boundaries should be. Part of the consideration is making the results fit the expected pattern based on historical data, but it's not quite as straightforward as just giving the top 20% an A, and the proportion of students getting A-grades varies significantly by subject (so, for example, around 70% of people who do an A-level in Mandarin get an A, because most of the people taking A-level Mandarin are actually native speakers). Conversely very *few* people actually get an A in Accounting (which doesn't stop it from being seen as a "soft" subject by most schools and many universities).
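
To make the black magic marginally less magical, here is a toy sketch of one ingredient, the "comparable outcomes" idea: choose this year's boundary so that roughly the same share of candidates get an A as in previous years. The numbers and the function are invented, and real boards also weigh examiner judgement and question-level statistics, none of which appears here.

    def boundary_for_target_share(scores, target_share):
        """Lowest mark such that about `target_share` of candidates reach it."""
        ranked = sorted(scores, reverse=True)
        k = max(1, round(target_share * len(ranked)))  # candidates inside the band
        return ranked[k - 1]  # mark of the last candidate still inside

    # Ten made-up candidates; if about 20% got an A historically, the
    # A boundary lands on the mark of the second-best candidate.
    this_year = [34, 41, 47, 52, 58, 63, 66, 71, 77, 85]
    print(boundary_for_target_share(this_year, 0.20))  # -> 77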

That the questions should be hard to understand because of real life situations I just can't understand.


Me neither, I think it's just a classic "life isn't fair" argument.

On the mathematics thing, there used to be an educational movement called the New Mathematics in the sixties, which originated in the US, which sought to teach kids group theory and to calculate with different bases and whatnot.


Interesting. I'd always wondered what that Tom Lehrer song was about...

On the mathematics thing, there used to be an educational movement called the New Mathematics in the sixties, which originated in the US, which sought to teach kids group theory and to calculate with different bases and whatnot.

Which failed because, as you say, the educators didn't have the background to understand or teach the material themselves. The idea was to get kids interested in math by showing them the "fun" stuff that mathematicians find intriguing rather than the boring rote work of learning times tables. The result was that kids continued to hate, fear, and be bored by math, and in addition were now leaving school not knowing how to multiply or divide without a calculator.
Dan H at 09:37 on 2012-05-10
The idea was to get kids interested in math by showing them the "fun" stuff that mathematicians find intriguing rather than the boring rote work of learning times tables.


To be fair to "New Math" I think this is a bit of an oversimplification. My limited understanding of the movement is that it focused on teaching general principles over specific applications (which is very different to just teaching "the fun stuff").

Were it not for the fact that virtually nobody was qualified to teach it, it was (arguably) a rather good idea.

The result was that kids continued to hate, fear, and be bored by math, and in addition were now leaving school not knowing how to multiply or divide without a calculator.


I think the problem here is deciding what maths teaching is for. I'm actually rather suspicious of the attitude which assumes that the job of maths teaching is to get kids to learn to multiply and divide without using calculators, because it seems grounded in the "good economic units" philosophy of education.

If we had the same approach to teaching English that we have to teaching Maths, we'd spend ten years of compulsory education teaching students to spell increasingly complicated words, but never require them to read a novel, a play or a poem.

If we had the same approach to teaching English that we have to teaching Maths, we'd spend ten years of compulsory education teaching students to spell increasingly complicated words, but never require them to read a novel, a play or a poem.

I would say if USians were getting the same results teaching English that they get teaching math, they would have a significant proportion of high school and college students rereading The Cat in the Hat every year and never understanding what the hell Dr. Seuss is going on about. I don't think kids are going to get anything out of literature if they have trouble puzzling out basic sentences.

My problem is I can't manage to say any of this without sounding like an enormously smug, elitist asshole, but that's honestly not my intention. I'm just getting angry at the amount of time kids are wasting in school, and how thoroughly they're being cheated out of an education.
Sunnyskywalker at 18:06 on 2012-05-10
The result was that kids continued to hate, fear, and be bored by math, and in addition were now leaving school not knowing how to multiply or divide without a calculator.

And now they're the math teachers. I have known several people who wanted to be elementary school teachers because they weren't good at math and thought teaching lower grades was a good way to avoid doing too much of it. But "math anxiety" does filter down to the kids, and it's a problem.

Back to the tests for a moment... I'm currently working at a job where we score standardized tests (not the SAT), and I have to say, if people gave more thought to how these things are scored, they might be less enthusiastic about the poorly-worded and extremely open-ended questions. It's very hard (a) to get a rubric that will cover most kinds of answers you'll encounter and rank them in a logical way, and (b) to get a room full of people to score long written responses consistently. Even if that kind of test would be better for the students for some reason, it's still no good if it can't be scored consistently.

Worst case scenario (which I haven't seen yet, fortunately - so relax, US students!), you'd end up with the evaluators going, "Well, there's nothing about this kind of answer on the rubric, but this one seems more interesting than that one I saw earlier even though the content is essentially the same... what did I give that other one, anyway? Oh well, can't find it, I'll just give this one a high score because it seems good to me. And this student spells better, which I like." Meanwhile, the person next to the first is giving that not-covered-in-the-rubric answer an entirely different score when it comes up, because s/he doesn't like it as much for whatever reason, and neither of them decided to ask the supervisor what the team should do about this particular answer that keeps coming up. How useful would those results be as a metric?

Which isn't to say that you can't have open-ended questions and still score them and get results useful for... something. But they had better be very well-designed questions, and whoever makes the rubric had better be really on the ball.

And that's not even covering the situations like the "what is the boiling point of water" question, where there may be more than one reasonable answer but you're only allowed to give credit for one of them. Unless the test is specifically marketed as "the one where you have to figure out the examiner's arbitrary decisions based on no evidence, kind of like you'll do with your boss someday," no one looking at the scores will know what that test is really measuring. Not subject knowledge, at least not well.
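
For what it's worth, there is a standard way to put a number on how badly a scoring room is drifting: Cohen's kappa measures how far two scorers agree beyond what chance alone would produce (1 is perfect agreement, 0 is no better than chance). A minimal sketch, with made-up marks for ten scripts:

    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Agreement between two scorers, corrected for chance agreement."""
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        counts_a, counts_b = Counter(rater_a), Counter(rater_b)
        chance = sum(counts_a[g] * counts_b[g] for g in counts_a) / n ** 2
        return (observed - chance) / (1 - chance)

    a = [3, 2, 4, 1, 3, 2, 4, 3, 1, 2]  # scorer one's marks on ten scripts
    b = [3, 2, 3, 1, 4, 2, 4, 2, 1, 2]  # scorer two's marks, same scripts
    print(round(cohens_kappa(a, b), 2))  # -> 0.59, only moderate agreement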
Janne Kirjasniemi at 23:13 on 2012-05-10
The result was that kids continued to hate, fear, and be bored by math, and in addition were now leaving school not knowing how to multiply or divide without a calculator.

While that may have been true for many kids, I have heard of good results as well, and in a way the New Mathematics influenced the teaching of math even afterwards. I would think that even with just learning rules and techniques, you will still get students who hate mathematics and still don't know how to do anything without a calculator. It is a permanent problem in pedagogy that a motivated kid is able to learn pretty much anything and an unmotivated one will resist any teaching. And of course no kid will always stay motivated, and some will not be motivated at all, a problem which used to be solved by trying to force the kids through authority or, earlier still, violence.

It is an intriguing thought that mathematics could be taught through concepts and trying to understand how it works. But of course mathematics is often seen as a matter of practicalities, rather like driving a car. You don't need to know exactly how a combustion engine works as long as you can keep a motor running, and you don't have to learn set theory if you can at least multiply in your head and do long division and the like, so that should hopefully get you sorted on how the interest works on your mortgage.
Dan H at 23:15 on 2012-05-10
@fishinginthemud

My problem is I can't manage to say any of this without sounding like an enormously smug, elitist asshole, but that's honestly not my intention. I'm just getting angry at the amount of time kids are wasting in school, and how thoroughly they're being cheated out of an education.


I don't think you're coming across as smug or elitist at all - education is a complicated area and people have strong feelings about it.

Obviously I can't talk about the US system, I just get very twitchy around the idea that mental arithmetic is the primary measure of effective maths teaching. I appreciate that people often use it as a shorthand, but I meet a lot of people who really do seem to think that the fact that they can divide 428 by 7 in their heads makes them smarter than people who can't.

I also have to deal with a lot of students who came from school systems that focus *very strongly* on teaching things like mental arithmetic, and they tend to be extremely bad at actually applying their ideas to new situations, because all they've ever been taught to do is apply methods. They also overestimate the amount of things they can do without a calculator and so they make simple, avoidable mistakes.

@Sunnyskywalker

Even if that kind of test would be better for the students for some reason, it's still no good if it can't be scored consistently.


Consistency is remarkably low on most people's priority lists for exam systems. As long as an exam allows us to label a (sufficiently small) number of people as "best", people don't really care whether they are actually any better than the people who were not so labelled.
Sunnyskywalker at 23:30 on 2012-05-10
I think I actually make more mistakes trying to do arithmetic than I did doing calculus. They're (somewhat related but) different skills.

People get so smug about their test scores, for such spurious reasons! It just makes me want to tell them that in fact they have no way of knowing the examiners didn't just give them higher scores because they had a good lunch and were feeling more generous than when they scored the kid's classmates that morning. You have no way to be confident that you are one of the "best" people! Sadly, this might actually throw some kids into an existential crisis.

And actually, given how rigorous the procedures are now to keep everyone scoring consistently, it makes me wonder whether the tests might be scored more consistently and thus be slightly better benchmarks for whatever than in 1953. For all we know, they used the "throw them down the stairs and give top scores to the ones that make it all the way to the bottom and then go have another martini" method.

I have known several people who wanted to be elementary school teachers because they weren't good at math and thought teaching lower grades was a good way to avoid doing too much of it.

There are education majors who will say things like "Why do I have to learn algebra when I'm only going to be teaching second grade?" They not only don't know algebra, they *don't want to learn it,* and they're going to be introducing young children to math with that attitude towards learning.

Obviously I can't talk about the US system, I just get very twitchy around the idea that mental arithmetic is the primary measure of effective maths teaching. I appreciate that people often use it as a shorthand, but I meet a lot of people who really do seem to think that the fact that they can divide 428 by 7 in their heads makes them smarter than people who can't.

I'm referring more to American college students who aren't entirely sure whether or not 6 x 8 = 48, or couldn't tell you off the top of their heads what 7 x 7 might be. I once argued with a student who wanted to know why he should worry about what 7 X 7 is when he had a calculator to do that for him. It's the attitude I don't like, more than the results.
Arthur B at 16:20 on 2012-05-11
There are education majors who will say things like "Why do I have to learn algebra when I'm only going to be teaching second grade?" They not only don't know algebra, they *don't want to learn it,* and they're going to be introducing young children to math with that attitude towards learning.

I can only speak from a student's-eye-view, so question to the education professionals in the conversation: would it be fair to say that this has a knock-on effect that goes beyond imparting a dislike of a particular subject? In my experience the teachers who were acting outside of their comfort zone not only made the material in question more painful than it needed to be, but they also had problems maintaining classroom discipline in general - the kids could tell the teacher wasn't feeling confident and they'd always exploit it.

(I had great fun with one chemistry teacher who'd never actually studied chemistry to university level and had a physics background instead - he'd constantly make mistakes on the whiteboard and the more I pointed out the more he made. And then I'd ask him a question about something he hadn't read up on so he had to go next door to ask the teacher who had actually studied chemistry what the answer was. Dickish of me, I know, but if I let it stand he'd have been filling the entire class's heads with crap.)
Michal at 17:29 on 2012-05-11
I once argued with a student who wanted to know why he should worry about what 7 X 7 is when he had a calculator to do that for him.

This isn't exactly a new phenomenon; my dad made the exact same argument in elementary school except he said "abacus" instead of "calculator". And then the teacher beat him. Actually, as far as I can tell, his educational experience mostly consisted of being hit with various objects (education in communist Poland FTW!).

This isn't exactly a new phenomenon; my dad made the exact same argument in elementary school except he said "abacus" instead of "calculator".

I can excuse that in an elementary school student more easily than in a university student taking a junior-level statistics course.

I guess there are arguments to be made that students who don't want to learn basic arithmetic shouldn't have to, and maybe there's nothing wrong with that. It's just that it fits in so neatly with the American idea that anything you don't master on the first try is simply not for you and not worth spending any time trying to learn. We're embarrassingly bad at failing and learning from our failures.
Dan H at 22:27 on 2012-05-11
There are education majors who will say things like "Why do I have to learn algebra when I'm only going to be teaching second grade?" They not only don't know algebra, they *don't want to learn it,* and they're going to be introducing young children to math with that attitude towards learning.


I think this is one of those situations where the American system is sufficiently different from the UK system that I'm not sure I'm remotely qualified to comment on it. In England you wouldn't have an "Education Major" - you'd do a teaching qualification after undergrad. Or sometimes not even that (my only qualification to teach Physics is a Physics degree).

I don't know very much about primary teaching, or how that works, but in secondary school you'd very rarely get anybody teaching maths who hadn't got some kind of mathematical qualification. I think UK degrees are also rather narrower than US degrees - in traditional UK universities you don't have a "major", you just study one thing (things are getting more modular but nobody would expect somebody with an English or History degree to know anything about Maths at all). So there's an extent to which "you don't need to know this" is an inbuilt assumption of our university system already.

If the US system requires people to teach subjects they aren't particularly interested in, that seems like quite a major flaw.

I'm referring more to American college students who aren't entirely sure whether or not 6 x 8 = 48, or couldn't tell you off the top of their heads what 7 x 7 might be. I once argued with a student who wanted to know why he should worry about what 7 X 7 is when he had a calculator to do that for him. It's the attitude I don't like, more than the results.


Again, I think this is a slightly peculiar cultural gap. My experience of UK university students is that they either study Arts subjects, in which case they pay no attention to mathematics whatsoever because they have no interest in it, or else they study science subjects, in which case they're only interested in the kind of maths that doesn't involve numbers. I have a very small number of friends who are actually *good* at mental arithmetic, but we see it as more of an occasionally-useful party trick than anything else.

To put it another way, I couldn't tell you off the top of my head whether six eights are forty-eight or not. I could take a couple of seconds and work out that two sixes are twelve, two twelves are twenty-four, and two twenty-fours are forty-eight, but it's a level of mental effort I'd only expend if I actually needed the answer. I do know that seven sevens are forty-nine, but it's something I'm aware of mostly as a kind of random factoid, like I know that the Battle of Hastings was in 1066, or that the mass of an electron is about 9.1 x 10^-31 kilograms, or that the longest word in the English language that uses all of its letters exactly once is "uncopyrightables".

Again, I wonder if this is a cultural difference. I come from an educational background that prioritizes depth of knowledge in a narrow field. Obviously I think basic literacy and numeracy are important, but I don't think there's anything particularly wrong with leaving the mechanics of arithmetic to machines.
Janne Kirjasniemi at 22:51 on 2012-05-11
On the same subject, surely at elementary school level teachers have to be able to teach kids basic maths? Teachers of young kids in Finland (ages 7-11) are usually pedagogy majors, but they still have to serve at least a month as interns in schools and have to study the whole curriculum for the different subjects before they can be qualified teachers. From age 12 onwards, all subjects are taught by people who have a degree in the subject they are teaching, plus a strong minor in pedagogy; otherwise they aren't qualified teachers (you can still be a substitute or work on a shorter contract, but you'd never get full pay or tenure as an unqualified teacher).

I remember from my one year as an exchange student in the UK that it wasn't possible to study subjects outside of the 'major', or whatever you'd call it. This seems a shame, because here you can have your major in art history and study physics and mathematics as minors if you want to. Not that that would be terribly useful as a career move (outside an academic or artistic career, perhaps), but as it is, there are not very many subjects you can't do, unless they're restricted for practical reasons (like lab courses in chemistry or biology, or most of medical science). Of course, being a student of the nth year is not something one would want to be.
Shimmin at 23:16 on 2012-05-11
Flexibility of UK degrees depends a lot on the university. Some places have a set path for each degree and you just pick between a few approved specialisms towards the end. Others are basically modular. I was at one of the latter, realised I hated half my degree syllabus, and dropped those bits. I ended up studying in four departments and got a degree in three of them. Only timetabling and module prerequisites limited my choices, really, in theory I could have probably studied in five or six departments.

That being said, from the various universities I've seen, the UK still doesn't have the sheer breadth of some other countries. You can't take a little bit of something, you generally pick four to eight modules for a whole year and that's it. There's often a way to pick up extra stuff (language certificates, IT training) through the uni, but it's in addition to the degree, not part of it, and the range is pretty limited. Generally you're looking at taking actual evening classes or what have you.

I do know that seven sevens are forty-nine, but it's something I'm aware of mostly as a kind of random factoid, like I know that the Battle of Hastings was in 1066, or that the mass of an electron is about 9.1 x 10^-31 kilograms, or that the longest word in the English language that uses all of its letters exactly once is "uncopyrightables".

I see what you mean, and mental arithmetic might be a bad example. I do get concerned when I see adult students unable to do things like add fractions or understand the distributive property of multiplication, but there's really no reason they should have remembered these things since school, and it certainly doesn't make them stupid or bad students. What usually happens, though, is that they haven't been given an understanding of math that they'd need to do extremely well on a standardized graduate-level exam, which is what they're paying me for.

And I think there's a very real difference between a student who acknowledges that their math skills are rusty and works hard to improve them, and one who doesn't and expects the universe to right itself when the time comes.

In England you wouldn't have an "Education Major" - you'd do a teaching qualification after undergrad. Or sometimes not even that (my only qualification to teach Physics is a Physics degree).

I'm not a certified teacher, so I don't know exactly how it works either, but my understanding is that anyone with a bachelor's degree in any subject can earn a teaching certificate and be qualified to teach a certain grade. There's also the option to major in something like "early childhood education" as an undergraduate or graduate student, where you study pedagogy theories and child psychology and similar subjects, with varying levels of rigor depending on the institution. (I might be wrong about any or all of this.)
Michal at 01:44 on 2012-05-12
From what I understand, in Canada elementary/high school teaching is covered by the all-important Education degree, with the unfortunate side effect that ed majors end up teaching subjects they know next to nothing about (also known as the dreaded gym-teacher-teaches-history-class phenomenon). Professors, on the other hand, don't need any sort of ed degree at all, instead going the more traditional undergrad-masters-PhD route. Undergrad degrees are fairly broad here: you can choose a major and a minor, a double major, or an honours degree (in one subject). I have a BA with a major in history and a minor in English, for instance, though my actual coursework was broader than that; I also took physics and anthropology in first year, and Art Fundamentals, Design and intermediate Polish in second to fill up necessary credits.

I can confidently say that I still know all my multiplication tables from 1 to 10 and can still do algebra, though in the six years after high school I have managed to forget nearly everything else. I can't solve quadratic equations, these days, nor do I have the foggiest idea how to do trigonometry any more.

I think I actually make more mistakes trying to do arithmetic than I did doing calculus. They're (somewhat related but) different skills.

This was my experience too, back when I still knew how to do calculus. It was actually quite strange taking calculus the first time in grade twelve and suddenly outperforming the usual math whizzes because the thinking involved seemed closer to something you'd do in the humanities. Or at least, that's how it felt.

I'm kind of sad that I haven't managed to retain anything math or physics or chemistry or biology or French related after putting so much time into those subjects all those years ago.
Sunnyskywalker at 00:10 on 2012-05-13
I don't know about high school, but yeah, I don't think you need a relevant subject degree to teach in most cases. They might prefer it, and I think education degrees require you to take at least some basic courses in various subjects so they know you're at least minimally competent, but I'm pretty sure you can teach, say, 8th-grade science or math or history without having any kind of science or math or history degree. (Foreign languages might be a different story, but I'm not sure there either.) And if you really hate math and can't add 12 and 35 in your head, even teaching 1st grade will not let you avoid math, no. Which means you'll likely be teaching those kids (inadvertently) that math is hard and/or boring stuff.

Probably you will not die if you can't remember 12 x 12, but in general I'd say it's preferable to at least be able to do mental addition and subtraction well enough to know whether the cashier is ripping you off. Whipping out a smart phone and blatantly checking them would just be rude :D

College distribution requirements in the US aren't usually that strenuous in most examples I know of. They might make everyone take a basic composition skills/English class, or let you test out of it, so they can at least be sure your professors in whichever subject you're planning to study will be able to read your papers (or lab reports, or whatever). If you're not in a math or science major, you'll probably have to take a class or two somewhere in that general area; likewise for social sciences and humanities. A lot require a US history, I guess on the theory that probably most people's high school US history taught them nothing, and they ought to know something about the country they live in. On the other hand, I managed to test out of a lot of the distribution requirements by virtue of having done well on Advanced Placement exams in high school, so maybe it takes more time than I'm giving credit for.

I knew one girl who double majored in physics and art, and she says they informed each other in unexpected ways. Not that everyone is a super-genius polymath like her... But still, I do find that random "not my field" stuff turns out to be extremely useful all the time, and I'm with Michal in wishing I remembered more of it.
Sunnyskywalker at 00:11 on 2012-05-13
Meant to specify that that was for the US educational system.