Misappropriating Institutional Credibility


The Conservatives have come under fire in the last couple of days over their Twitter debacle. For those fortunate enough not to have followed the events: during the televised ITV debate between Corbyn and Johnson, CCHQ altered their Twitter profile so as to appear to be an independent fact-checking organisation called FactcheckUK. The Twitter 'blue tick' gave the impression that this was a real organisation, verified by Twitter. True, tweets also displayed the CCHQ handle, but this is much less prominent, and many users would not even know who CCHQ are. In any case, it is very difficult to imagine a motive for this change other than to deceive. What's more worrying is that this isn't a one-off event. It's a pattern of behaviour we are seeing increasingly often, and unless I am mistaken, it is all coming from broadly the same group of people with the same campaign strategists: Conservatives using either fake institutions, or the falsely appropriated credibility of real ones, to disguise party propaganda as trustworthy independent analysis.



The first examples of this came during the 2016 referendum campaign. Vote Leave, an organisation largely run by Conservative strategists, notably including Dominic Cummings, produced leaflets encouraging voters to support Brexit as a means of helping the NHS. The leaflets used a (slightly altered) NHS logo and a format (font, colour scheme, layout etc) almost identical to those produced by the NHS. They appear to have been distributed in hospitals, with the words 'Help protect your local hospital' on the front. In doing so Vote Leave were misusing the institutional kudos of the NHS to add credibility to their material. True, the leaflets also carried the Vote Leave logo, and you might think that many people would realise what was going on. But some might not, which is why such deceptions can be effective. Indeed, Vote Leave were called out on this by the Treasury Select Committee during the campaign for this very reason, as can be viewed here:



Another, somewhat less egregious example of this behaviour during the campaign was the leaflet sent to households shown on the right here. The title page reads 'EU Referendum Facts', and the leaflet uses the independent-sounding address www.eureferendumfacts.org, but the material was in fact produced by Vote Leave.


This strategy appears to have been repeated after the referendum, albeit with somewhat more organisational effort. In 2017, Conservative MEP and former Vote Leave committee member Daniel Hannan set up the pro-Brexit think tank IFT, whose launch event was attended by Boris Johnson. It was initially called the Institute for Free Trade, but had to change its name, because the word 'Institute' is a protected title given only to reputable academic institutions publishing high-quality research. The organisation managed to keep its acronym by becoming the Initiative for Free Trade. Once again, the effect of the exercise would have been to use misleading naming or presentation to pass off partisan material as that of a reputable, non-partisan organisation.

There is a certain irony to all of this. Michael Gove famously said in 2016 that people had 'had enough of experts' from institutions 'with acronyms', and amongst many this remark has become symbolic of a particular style of politics. But in reality, those who practise that style seem to be fully aware of the power and kudos that can come with expert analysis backed by trusted and respected institutions. This is why the strategy they seem to be pursuing is so dangerous.


Attitudes towards freedom of movement

A couple of days ago I spotted a rather surprising result in a YouGov poll: by an enormous margin, respondents said that they would like to maintain reciprocal rights of UK and EU citizens to live and work in each other's countries. Of those polled, 67% said such rights should be maintained, 15% said they shouldn't, and 19% said they did not know. In other words, what is generally believed to be the key motivation for Brexit, ending freedom of movement, is not actually wanted. These results got rather more attention on Twitter than I anticipated, and a number of people responded that they are merely a reflection of how the question is posed: if the question were framed in terms of 'control' of immigration, or simply in terms of support for "freedom of movement", the results would be quite different. I don't doubt that this is correct, but I wanted to briefly respond to this point, as I think it is an important one.

It is famously true that opinion polls can generate multiple answers to what is essentially the same question, depending on how it is framed. There are two possible responses to this. The first is simply to disregard the idea of popular opinion and see it purely as an ephemeral product of how questions are posed. Sometimes this may well be a reasonable assessment, and I think there is definitely such a thing as a healthy scepticism around these issues.

But the second response is to ask: OK, so which question is the most useful and informative? This of course depends a lot on what you are using the question for. If you are a political strategist, you might well find it very useful that questions framing freedom of movement in terms of 'control' yield a negative reaction. But if you are genuinely interested in getting an honest sense of public opinion as a means of guiding policy, the question has to be: which way of describing freedom of movement is the most honest reflection of what the policy actually is? No doubt this is itself contested, and there are occasions where the appropriate way of understanding a policy is genuinely a difficult question. But with freedom of movement, framing the policy as a reciprocal right (the right for citizens to live and work in one another's countries) seems to me simply the most accurate and thorough description. The alternatives either omit that reciprocity or present fictitious alternatives (one-way rights for UK citizens, or systems of 'control' which imply far more effective and benevolent bureaucracies than ever actually exist).

For what it's worth, I don't think that public opinion on these questions should be the sole guide to policy. Stripping large numbers of people of acquired rights might well be wrong regardless of whether it is popular to do so. But it seems tragic for people to lose these rights on the basis of a popular support that does not even exist.

O Level and GCSE Maths Compared


When people of a certain age talk about GCSEs, the conversation may well quickly turn to the old ‘O Level’ qualification, and, more often than not, how much harder these were than anything 16-year-olds are expected to do today. Part of the problem with these conversations is that few people are in a position to make a real comparison. Most people have only ever seen one type of exam, and if it is the O Levels, this was a long time ago. Perhaps for that reason, when I was a teenager (I was a weird teenager) the old exams had a certain mystique. As somebody who got a little too much of their sense of self through exam grades, I couldn’t help but wonder how I would have done at them. While I’ll never know the answer to that question, thanks to the wonders of the internet, it’s certainly possible to compare the two exams, if you are geeky and obsessive enough to do so. What follows are the results of that endeavour with the old Maths O Level, and an attempt to answer that all-important question: how much harder was it really?

The first thing to mention is that the O Levels (at least the exams from the ’50s and ’60s) feel in many ways like very dated exams. They are densely printed, long, and in terms of structure feel a lot more like university exams. They have two sections: one of short questions, and one of longer questions from which you choose a few (GCSE Maths papers don’t involve any choice).



Some of the material is very much from a time when people didn’t have calculators. There are lots of questions clearly testing your ability to use logarithmic and trigonometric tables, and others which ask you to approximate π as 22/7 (at GCSE you’d either just put it in a calculator and round, or give your answer as a multiple of π, but it's still a nice approximation to use for rough calculations). There are then arithmetic problems which require some numerical manipulation to solve, often better thought of first as algebraic expressions to simplify, such as these ones here:
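Out of curiosity, the 22/7 approximation really is decent for rough work; a quick check (my own aside, nothing to do with the exam papers themselves):

```python
import math

# 22/7 = 3.142857..., while pi = 3.141593...; the approximation is
# accurate to about 0.0013, i.e. roughly a 0.04% relative error.
approx = 22 / 7
error = abs(approx - math.pi)
print(approx, error)
```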


You can see why there would have been a greater focus on numerical skills before calculators, but I think it’s a bit of a shame this kind of numerical manipulation isn't taught as much any more, as it can help children with algebra. Part iii) of this question is also a reminder of another pesky feature of these exams: old imperial units and pre-decimal currency (I couldn't answer part iii) without looking up how many shillings there were in a pound!)
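For anyone equally rusty: £1 was 20 shillings, and a shilling was 12 pence, so £1 = 240 pence. A throwaway converter makes the pre-decimal arithmetic painless:

```python
# Pre-decimal currency: £1 = 20 shillings (s), 1 shilling = 12 pence (d),
# so £1 = 240d. Convert a £/s/d amount into old pence.
def to_pence(pounds, shillings, pence):
    return (pounds * 20 + shillings) * 12 + pence

print(to_pence(1, 0, 0))  # 240 pence in a pound
print(to_pence(0, 2, 6))  # 'two and six' is 30 pence
```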

That said, there is a lot about the exams that hasn’t changed at all. The algebra is pretty much the same, both in terms of content and level of difficulty (if we are comparing the O Level to the Higher Tier GCSE after Gove’s reforms; the Foundation is dramatically less challenging, and prior to Gove’s reforms the GCSEs were considerably less difficult). This question, for example, taken from a 1962 O Level exam, could have come straight out of the GCSE were it not for the font:
 



There are a few algebra questions in the O Levels which do seem a little more inventive in terms of what is asked. Question 4 (ii) below, for example, requires students to solve not for a single unknown but for a quotient of two. That said, how difficult this kind of problem is depends very much on whether you have been taught to do it before or are seeing it for the first time, and I haven’t seen enough papers to know if this was a standard problem.


The section B algebra problems also seem to have required a little more initiative and personal input to solve. This one here, for example, I could only solve easily by introducing two unknowns of my own (I used n for the initial number of items sold and N for the total), which you could then eliminate by rearranging.

The GCSE does require algebraic proof and introducing an unknown to solve numerical problems, but not introducing several unknowns to derive a purely algebraic expression free of the variables you’ve introduced. An example of the type of problem tested at GCSE is here, which you solve by forming a quadratic (e.g. by setting the number of green pens as x and blue as x + 3).
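To illustrate the GCSE style (with invented numbers, not the actual question): suppose there are x green pens and x + 3 blue pens, and the product of the two counts is 40. Then x(x + 3) = 40, i.e. x² + 3x − 40 = 0, which factorises as (x + 8)(x − 5) = 0, so x = 5. A quick check via the quadratic formula:

```python
import math

# Solve x^2 + 3x - 40 = 0, the quadratic formed from the (hypothetical)
# condition x(x + 3) = 40, using the quadratic formula.
a, b, c = 1, 3, -40
disc = b * b - 4 * a * c            # discriminant: 9 + 160 = 169
roots = [(-b + math.sqrt(disc)) / (2 * a),
         (-b - math.sqrt(disc)) / (2 * a)]
green = next(r for r in roots if r > 0)  # a count of pens must be positive
print(green, green + 3)  # 5.0 green pens, 8.0 blue pens
```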



One way in which the O Level was definitely more challenging was the geometry. GCSE geometry questions tend to provide diagrams, whereas the O Levels required you to work out for yourself what the diagrams would look like based only on worded descriptions. This question here, for example, could easily be a GCSE question, but there is no way they would ask you to do it without a diagram provided.


The proof questions were stylistically different, but don’t seem all that much more challenging. The main difference is volume: the O Level papers seem to have a lot of geometric proof involved, whereas on the GCSE there will only be a couple of questions. Below, for comparison, are two geometric proof questions, first a 1968 O Level and below that a 2017 GCSE question.





Content-wise, the other main difference is that the O Level was in some ways a narrower exam. In the 1950s and ’60s papers I didn’t find any statistics, data handling or probability questions, and very little on volumes, number properties, prime factor decomposition or surds. Algebra, geometry and doing long calculations with tables and slide rules seemed to be the greater focus. That said, the O Level did require knowing some basic calculus, which, apart from the Edexcel IGCSE, isn’t taught until A Level (and even the IGCSE has no integration). One exam I found from 1957 also had a rather charming third paper on the history of mathematics. I don’t know whether this was a common part of the curriculum, but there is certainly something wonderfully quaint about it.



So, all in all, was the O Level Maths exam more difficult than the GCSE? Compared to the exams I sat, pre Michael Gove, I regret to say the answer is certainly yes. Compared to the exams students sit now, I’m less certain. There are certainly fewer gift questions (even the Higher Tier GCSE still has some very easy questions at the start of the exam; the O Level seemed to have fewer, and they were less easy). On balance, I would still conclude that the O Levels were a little more challenging, if nothing else because they required more personal input and initiative to get started on the questions (introducing your own variables, working out what a geometry problem looks like from a description). But the two aren’t a million miles apart.