
Lies, damned lies and facts

Why people ignore facts to believe what they want to

It’s an inconvenient truth that facts don’t convince people. In fact, there is evidence that facts can work to convince people to believe the exact opposite, even more strongly than they did before.

In the UK’s EU referendum the Leave campaign led on the lie that the UK sent £350m a week to the EU. It is perhaps the most blatant example in recent democratic politics of a campaign leading on a lie, even in the face of facts from experts, then still going on to triumph. It perhaps paved the way for Donald Trump whose campaign and subsequent behaviour in the White House has taken lying to a whole new level.

It seems that facts no longer matter. ‘Post-truth’ was the Oxford English Dictionary’s word of 2016. Trump adviser Kellyanne Conway coined the phrase “alternative facts”.

Big tobacco’s obfuscation


However, it’s not just a recent phenomenon. In the 1950s the tobacco industry faced devastation as scientists published irrefutable evidence of the link between smoking and cancer.

It didn’t finish the industry off. Instead the tobacco industry embarked on a process of obfuscation. It disputed indisputable facts. It questioned unquestionable experts. It challenged the unchallengeable.

It was also one of the public relations industry’s darkest hours. In 1953 senior tobacco industry executives met John Hill, founder of PR agency Hill & Knowlton, at the Plaza Hotel in New York. At this infamous meeting Hill is reputed to have advised the tobacco industry to create a scientific institute to refute the scientific research and create the impression that its products were safe. The Tobacco Industry Research Committee was born. In today’s PR jargon it was a blatant example of astroturfing – creating an artificial grassroots or quasi-independent organisation to deliberately mislead the public, media and politicians.

In today’s world of professional public relations astroturfing is clearly outlawed as unethical in the codes of practice of industry bodies such as the UK’s Chartered Institute of Public Relations (CIPR) and Public Relations and Communications Association (PRCA) as well as the Public Relations Society of America (PRSA).

The tobacco industry ran the fake research institute for almost fifty years before legal action finally forced its closure. As recently as 1994 the chief executives of the seven largest tobacco companies gave evidence under oath to the US Congress that they did not believe their products were addictive. They have never been punished for lying.

It’s ethical to defend your industry, but don’t lie while doing it

The tobacco industry isn’t alone and today there is perhaps questionable practice from the sugar industry, the fast food industry and the oil and fossil fuels industries. There is nothing unethical about these industries campaigning and lobbying to protect themselves from legislation that will harm their shareholders, customers and employees. However, it does become unethical if they do what big tobacco did and deliberately lie.

The behaviour of big tobacco even sparked a whole new field of academic study. In 1995 Stanford University historian Robert Proctor coined the word – agnotology. It comes from agnosis, the neoclassical Greek word for ignorance or ‘not knowing’, and ontology, a branch of metaphysics which deals with the ‘nature of being’.

Agnotology is the study of wilful acts to spread confusion and deceit, usually to sell a product or win favour.

However, the problem runs deeper than nefarious campaign groups and corrupt big business. Facts are in so much trouble that it has become harder for honest, ethical campaign groups and the majority of law-abiding, ethical businesses to get their messages across.

The instinctive reaction of public relations professionals faced with lies and untruths is to immediately rebut them with the truth and facts. It’s also the instinctive reaction of legitimate politicians, academics, journalists and indeed concerned and interested ordinary citizens.

The problem is that numerous academic studies show facts aren’t necessarily what convinces people to believe something, and facts are even less likely to persuade them to change their minds.

Rebuttal should never repeat the lie

Psychologists Hollyn Johnson and Colleen Seifert ran an experiment in 1995 in which people read reports of an explosive warehouse fire. The report mentioned petrol and paint cans, but later explained that neither petrol nor paint had been in the warehouse. When questioned about the report they’d read, people remembered that the paint wasn’t there, but later, when asked to explain the facts about the fire and why there was so much smoke, they would mention the paint. Even though they already knew it was wrong, they fell back on it because they didn’t have an alternative explanation. The study shows that once people have heard an untrue claim they can’t simply ‘unhear’ it and forget it.

That’s why one of the rules of rebuttal is never to repeat the false claim. Even if the claim is repeated only to rebut it with the facts, the repetition can make it stick in people’s minds. For others, the rebuttal might also be the first time they’ve heard the claim. The danger is that as memories fade the lie is the only thing that is remembered, because the lie is the part that has been repeated and heard the most.

One of the mistakes of the Remain campaign was to repeat the lie about £350m for the NHS when rebutting it.

Nazi propaganda chief Joseph Goebbels is reputed to have said:

“Repeat a lie often enough and it becomes the truth”

He’s right. And there are psychological experiments that demonstrate what is known as the ‘illusion of truth’ effect. In the original experiment participants were given a series of trivia ‘facts’ and asked to rate them as true or false. The trivia consisted of statements the student participants were very unlikely to know, such as “Basketball became an Olympic sport in 1925”. The experiment was repeated twice more, each time with a gap of two weeks. Each time the participants were given 60 trivia statements, some true and some false. In each session 20 statements appeared on all three lists and 40 were different. The participants’ confidence in the non-repeated trivia ‘facts’ stayed constant, but their confidence in the truth of the repeated statements increased each time, regardless of whether the trivia ‘fact’ was true or false.

The backfire effect

The psychological theory of cognitive dissonance describes the tension people feel when they are presented with information that contradicts their existing beliefs. Cognitive simplicity is the theory that when our brains process information, belief comes quickly, while processing information sceptically takes longer.

In practice the combination of cognitive dissonance and cognitive simplicity means that when people are presented with facts that contradict strongly held beliefs, they are more likely to carry on believing what they already believe, despite the factual evidence to the contrary.

It’s known as the ‘backfire effect’ and has been researched in a series of experiments by Dartmouth College professor Brendan Nyhan and University of Exeter professor Jason Reifler. Their research indicated that factual corrections can actually increase misconceptions among believers.

In one experiment participants were given fake newspaper articles confirming that there were weapons of mass destruction (WMD) in Iraq. When participants were then given real articles explaining that WMD were never found, people who opposed the war accepted the correction. People who supported the war did the opposite and became even more convinced that there were WMD, despite now having factual evidence that there weren’t any.

If facts don’t work what can PR professionals do about it?

Facts are the truth and we might as well all give up now if the truth doesn’t work.

I believe the fact is that facts alone don’t work. It doesn’t mean we should give up on facts and the truth.

Some of the things that public relations professionals can do are:

  • Be first – don’t wait for people to base their beliefs on what others say, but frame the issue first.
  • Let others speak for you – create content that people can discover and share themselves.
  • Listen and understand – see if you can ‘triangulate’ the issue and discover common ground as the first step. See if you can show that believing facts doesn’t always necessitate changing deeply held beliefs, but merely adapting them.
  • Be emotional – this will be the subject of another blog post, as cold, hard facts don’t convince or persuade on their own; too often they are trumped by feelings.

If you want to learn more about the changing nature of modern public relations and communications then please get in touch to discuss how I can help via my PR consultancy and PR training.


Stuart Bruce
