
Loaded question: opinion polls and Britain at war

Ian Sinclair speaks to LILLAH FEARNLEY about her research on how opinion polls influence government decisions such as military intervention against Syria, and how the powerful shape this critical research

IN May, Rethinking Security, a Britain-based network of organisations, academics and activists working for a just and peaceful world, published Lillah Fearnley’s major new report Thinking Inside the Box: How Opinion Polls Shape Security Debates and Policy in the UK.

An independent consultant specialising in research on conflict, peace, security and peacekeeping, Fearnley spoke to me about her key findings, including her analysis of surveys on British intervention in Syria and her recommendations for future polling.

 

Why is opinion polling important to security debates and policy-making in Britain?

 

Public opinion polls or surveys, particularly high-profile ones that are picked up by or commissioned by the media, can generate debate around security issues and influence policy-making.

When politicians invoke public opinion to justify policy decisions, they often cite opinion poll results.

But the relationship is complex: governments seek both to influence and to respond to public opinion, and polling may, for example, be used to legitimise decisions after the event rather than as a means to “listen” and respond to the public.

Public opinion polling has the potential to influence real-world security decisions by the government, sometimes in a critical moment when government or Parliament are in the process of debating them.

For example, politicians may invoke public opinion, as expressed in polls, to justify or oppose British military intervention. When the results of polls make a media splash, they may also influence the public imagination and debate around these issues.

This being the case, how a situation is characterised, including who or what needs securing, what is deemed a security threat, and what policy responses are therefore legitimate and “reasonable”, can be hugely significant.

Those asking the questions, polling companies and their clients, have significant power to influence how the issues are presented and shaped.

Once survey findings are published, we don’t necessarily stop to think about how questions were asked, what options were given (or withheld) and why, who was asked, or how the issues were framed; we just see the statistic.

 

How does opinion polling shape the national debate when it comes to security issues?

 

The focus and framing of questions on security shape the responses given and can therefore circumscribe the picture provided by the survey.

For example, the majority of surveys I reviewed for my research frame security in terms of threats to Britain’s national interests, which are rarely defined in any detail, rather than to people or communities or the planet.

This is done almost exclusively through closed questions in which respondents evaluate the severity of a list of predefined threats. The listed threats largely reflect the priorities of policy-makers.

The evidence suggests, however, that when asked open-ended questions about the top security challenges, the responses of the public diverge significantly from these priorities.

Threats that disproportionately affect the security of minority and marginalised groups, such as racism, far-right extremism, and sexual or gender-based violence, are mostly absent.

Some surveys include minority or marginalised groups, for example refugees from conflict zones, among lists of possible threats. This prompts the question: whose perceptions of security are prioritised? Whose voices are reflected?

It also raises questions about who identifies the list of potential security issues, and who decides what can or cannot be deemed a security threat.

Do the security issues presented reflect the general public’s perceived sources of insecurity, or the existing concerns of those who commission the research and set the survey questions? It appears to be the latter.

The way that security threats or challenges are framed (who or what needs protecting from whom or what) is likely to influence what approaches or responses the public deems appropriate and legitimate for building or maintaining security.

If security were defined differently, for instance as “a shared freedom from fear and want”, a very different picture of public opinion might emerge.

Overall, there is a far greater focus on militarised responses to building and maintaining security across the surveys than non-military responses to conflict and crises.

Peace-building and dialogue are rarely explored in high-profile public opinion surveys, and when they are, it is through hypothetical questions in surveys commissioned by peace-building NGOs.

 

Can you talk about a real-world example of when polling restricted the debate in Britain on foreign intervention?

 

Of the 16 surveys on British intervention in Syria between 2012 and 2018 that I reviewed, only six tested public support for non-military approaches to the situation, such as diplomatic pressure through economic sanctions, alongside military intervention or support options, although five included the provision of humanitarian assistance to civilians.

None of these surveys presented dialogue or mediation as a possible means of resolving Syria’s conflict.

The omission of non-military intervention options means that awareness of, let alone support for, alternative non-military responses is excluded from the debate at a critical moment when the government or Parliament are making decisions.

While the polls on British intervention in Syria during this period did not show a high level of public support for military intervention, the absence of non-military options may have implicitly limited the debate through reinforcing a belief that the use of military force is the only effective response.

Polls that exclude non-military options create a false dichotomy between intervening militarily and doing nothing, which is likely to steer public opinion towards the former.

 

You conclude the majority of opinion polls on security issues “are designed and frame their questions in terms of the prevailing discourse.” Why do you think those who commission polls — often newspapers — and the polling companies themselves end up doing this?

 

I think this is in part because any public opinion research, whether initiated by the corporate media, commercial polling companies or indeed organisations that are not part of the Establishment, such as NGOs, is likely to reflect the underlying assumptions and interests of those selecting and framing the questions. It is also likely to come down to who they talk to, or more importantly listen to, in setting the questions.

I asked YouGov about this process in relation to survey questions on intervention in Syria in 2013. They explained that the options presented in closed questions are usually driven by the options that are being reasonably discussed at the time.

They said that the client, for example a newspaper, may make suggestions for question wording and answer options, but the final wording is signed off by YouGov researchers and any options that are seen as irrelevant or leading are removed.

This raises the question of who is discussing what sort of intervention would be a reasonable and legitimate response. Is it the general public, or is it “security elites,” those considered to be experts on the issues?

I suspect, though I can’t say for sure, that it is the latter. This gives an insight into the way in which public opinion polling may reproduce elite or dominant narratives on security through reflecting back the prevailing discourse.

 

How do you think opinion polling should change?

 

When designing survey methodology and questions, pollsters and others involved should ensure that they are not simply seeking public endorsement of elite or government-identified definitions of, and priorities for, security. Instead, they should seek out the full range of perspectives and priorities, including by exploring the public’s own understanding of “security” and of the threats to it, so that responses are not limited to state-based definitions.

For public opinion surveys to provide a richer, more diverse and more comprehensive picture of Britain's security perceptions, they need to be designed in a way that facilitates the inclusion of the perspectives of a broader range of security stakeholders.

This depends not just on representative sampling but also on proactively eliciting the views of minority and marginalised groups. It also means ensuring that the wording of questions does not alienate respondents, and that response options presented in closed questions reflect the breadth of potential security concerns of a diverse British public.

I think it’s crucial that public opinion polling on security recognises the heterogeneity of public experiences of security and insecurity and builds on good practice methodologies for capturing this, some of which I’ve highlighted in my report.

Rethinking Security is applying the lessons from the study in the surveying methodologies used in its own Alternative Security Review.

To read Thinking Inside the Box: How Opinion Polls Shape Security Debates and Policy in the UK and find out more about the Alternative Security Review, visit rethinkingsecurity.org.uk.

Follow Ian Sinclair on Twitter @IanJSinclair.
