Bill Jamieson: Pollsters rediscover 'silent majority'

GENERAL election predictions reveal in-built bias that values the activist over the passive voter, writes Bill Jamieson
The beancounters got the 2015 general election wrong, but will they be able to adjust their models accordingly? Picture: AFP/Getty Images

How wearily familiar has the outcry become over inaccurate and misleading opinion polls. And how fatuous have been the conclusions of the latest inquest.

According to the findings of an independent inquiry released this week, the polls failed to predict the outcome of last May’s general election because they did not question enough Conservative voters.


The sampling methods used resulted, concluded the inquiry in a masterpiece of tautology, in the “systematic over-representation of Labour voters and under-representation of Conservative voters”. Gosh – whoever could have guessed?

And who could suppress an ironic smile on hearing a spokesman for one of the polling organisations assert this week that pollsters needed more funding for their work, subtly shifting the blame on to those mean-minded people who commissioned the polls? But as matters stand, who would wish to stump up more money for polls that have historically come up with wrong results?

However, while the pollsters have been in the dock, I sense their shortcomings expose a deeper failure to reflect “who the people are” and “what people really think”.

First, let’s deal with those polls. In the run-up to May’s election, the polls had consistently predicted that Labour and the Tories were running neck and neck and that Britain was probably headed for a hung parliament. In the event, the Conservatives secured an overall majority.

The pollsters stand accused of failing to question enough people in retirement and of relying too heavily on politically engaged young people. “There is also some evidence,” according to one report, “that people who are harder to get in contact with tend to be more likely to vote Conservative.”

Hiding away in cupboards, perhaps? It is not so much the tendency to vote Conservative that defines this segment as a tendency to respond cautiously to intrusion – whether by strangers questioning them on the doorstep or by unsolicited e-mail questionnaires. Older people tend to be more guarded, particularly in an environment where trust is less forthcoming than it was and when barely a week passes without some new example of computer fraud and identity theft.

But this, of course, is by no means the first time that opinion polls have failed to capture the strength of the Conservative vote. In 1970 the Conservative leader Ted Heath led his party to a convincing victory over Harold Wilson in defiance of most opinion polls at the time.

In 1992 John Major won a resounding Conservative victory over Labour, securing a majority that almost all opinion polls had failed to predict.


And in the Scottish independence referendum, the majority “No” vote confounded predictions of a neck-and-neck result – with some polls just a few days previously predicting a victory for “Yes”. Post-mortems pointed to the failure of pollsters to capture the strength of the “No” vote among older voters.

What might account for this systemic bias and its persistence? The errors of recent years are more difficult to excuse given the long-documented demographic shift and the growing numbers of over-60s in the voting population. Attention throughout the May 2015 general election, and particularly the Scottish independence referendum, tended to focus on enthusiastic young “Yes” campaigners at rallies and their vociferous participation in television debates and hustings.

Here we come to a broader, persistent bias that is by no means confined to the failings of pollsters and their sampling methods. It is one that is endemic across organisations and public life more generally. This systemic bias tends to treat “activists” as representative of all participants. It favours the active over the passive and the vociferous over the quietists.

This can create the impression that the activists are representative of opinion as a whole. But while they may be more active in voicing opinion – and certainly more active in opinion-forming – this does not necessarily mean they fairly represent all participants.

From my earliest schooldays I remember classrooms were typically divided into two groups – activists and passivists. The tone of the class was almost always set by the former. This broad division carried on into university where the activists dominated meetings of the students’ union. But it was notable that the larger the audience attracted to debates, the less likely it was that the voice of the activists dominated.

Time and again it has been easy to conclude, from the behaviour and responses of TV audiences, that the views of the more vociferous members reflected the opinions of viewers as a whole. This assumption has been underpinned by earnest assurances from broadcasters that their audiences were carefully checked and screened to ensure they were representative of the general public.

However, it was not just that, looking at the faces of those taking part, younger members were to the fore, but also that the audience had few representatives of that quieter and less demonstrative voter segment – young or old. What programme producer, after all, would much care for a quiet, passive audience when such programmes work on the basis of animated response – and the stronger the reaction, the better?


Take, for example, the controversy over the BBC Question Time audience last week, with accusations that it was dominated by young NHS doctors stepping up their anti-government campaign. So many striking doctors packed the audience for this flagship current affairs show that an exasperated host David Dimbleby was driven to appeal for “patients rather than doctors” to ask questions.

The BBC asserted last week that “Question Time audiences are always selected in accordance with our guidelines on fairness and balance, and this week was no different”.

But one of the public faces of the campaign, Dr Lauren Gavaghan, was sitting in the front row. And time and again the audience’s behaviour appeared pre-committed, noisy and overly partisan.

This is by no means the first time that the composition of a so-called “balanced” BBC Question Time audience has been challenged. Similar criticism has been made of the sister radio programme Any Questions, where even Jonathan Dimbleby admitted the audience was not balanced – blaming government cost-cutting for leaving the BBC “cash-strapped”.

Concern over audience composition and programme styles form part of a broader critique of the BBC and the attitudes of its senior management, producers and presenters.

Reflective of public opinion? The tone of its news and commentary has become ever more attitudinal: increasingly sanctimonious, finger-wagging and opinionated. Coverage of Europe’s refugee crisis depicted Hungary’s Viktor Orban as beyond the pale – until Scandinavian countries followed suit. The UK’s response, compared with that of Germany, has brought arched eyebrows and flared nostrils from presenters, leaving hapless viewers to feel that the crisis was all the UK’s fault.

Given all this it is perhaps not surprising that opinion poll results and methodology have not been more critically assessed until now. For there is a wider failure to grasp the true state of public opinion in the country today. The pollsters are not alone.
