The Arithmetic of Belief

Introduction

Our society is built on the art of persuasion. Capitalism relies on convincing consumers to turn their money into goods. In democracies, the best ‘spinner’ of the truth gains power, and within businesses, successful change relies on convincing people (staff, suppliers etc) to do things differently.

How to persuade has been studied for at least three thousand years - from the rhetoric of Athens with its appeals to logic, emotion and authority, to the more complex approaches used today where cognitive filters are modelled and bypassed. But most people have never looked at the simple arithmetic of persuasion – of how belief spreads through a population.

So in this edition of Theopraxis Thinking, I want to answer three simple questions:

  1. How do you measure the infectiousness of an argument?
  2. Is it better to be true, or to be interesting?
  3. In a world of competing information, how do beliefs stabilise?

Infectiousness

Let’s say that I have a belief that I want to pass on to others (such as ‘the spread of belief can be mathematically modelled’). So I express it via some form of medium (speech, newspaper, television etc), and hope that you will both hear/see the argument and accept it. We can therefore model this person-to-person infection in four stages:

  1. Repetition: I express the belief, once or many times, in some form of medium
  2. Reach: the message is seen by a number of people, the quantity depending on the medium
  3. Recognition: You notice the message amid the noisy confusion of life
  4. Realisation: You think about the message and accept it, thus making it one of your beliefs.

Et voilà, I have a convert, and that person is ready to transmit the belief to others.

If I telephoned each of you individually, repetition and recognition would be fairly high (as would realisation: I’m a persuasive bloke), but reach would be only one person per expression. However, if I took an advert in a newspaper every day for a year, then repetition and reach would be huge but recognition probably relatively low because the ‘richness’ of the message is limited.

The infection rate of my belief is therefore the product of four numbers:

  1. Repetition: the number of times I express it (from zero to many), multiplied by
  2. Reach: the number of potential new hosts reached per expression (from zero to many), multiplied by
  3. Recognition: the probability of the message being noticed by each recipient (0-100%), multiplied by
  4. Realisation: the probability of the message being retained by each recipient (0-100%).

I send out about 200 copies of this email and I know that most (75%?) of you actually read it. So if half of those who read it believe it, then I will have spread the idea to (1 x 200 x 75% x 50%) 75 new people. Not exactly Goebbels but, hey, it’s a start…
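
If you like to see the arithmetic laid out, here is a minimal sketch in Python. The function name and keyword arguments are mine, invented for illustration; the numbers are simply the ones from the example above.

    # A toy version of the four-factor arithmetic above.
    def new_believers(repetition, reach, recognition, realisation):
        # expected converts = expressions x audience x noticing x believing
        return repetition * reach * recognition * realisation

    # 1 expression x 200 readers x 75% noticing x 50% believing = 75 converts
    print(new_believers(repetition=1, reach=200, recognition=0.75, realisation=0.50))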

Repetition, use of mass media and compelling packaging of ideas work well, but they can only go so far. The trick with the last stage – realisation – is to fool the brain’s perceptual filters into accepting that the idea is true. There are a number of ways of doing this which I won’t go into here, except to say that we tend to accept things which fit with our current set of beliefs and reject things which don’t.


Interesting vs True

Let’s say that I get ten of my friends together and spread an idea. This idea is so beautiful and so self-evidently true that they are all captivated and just have to spread it themselves. So every tenth day they each contact ten people who haven’t heard the idea, and tell them about it… and every tenth day all of these new converts do the same. Because it’s true, no-one ever defects from the belief and they just keep on spreading the good news. How long would it take for the idea to spread around the world? If you don’t want to do the maths yourself (11 x 11 x 11 x…) you’ll just have to trust me when I tell you it will take about three months – nine or ten iterations.
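
If you would rather let a machine do the trusting, here is a quick sketch. The world population of roughly 6.5 billion is my assumption, not a figure from the argument above.

    # Count the ten-day rounds until 11-fold growth swamps the world.
    WORLD = 6.5e9              # assumed world population
    believers, rounds = 11, 0  # me plus my ten friends
    while believers < WORLD:
        believers *= 11        # every believer recruits ten more
        rounds += 1
    print(rounds, "rounds, or about", rounds * 10, "days")  # 9 rounds, 90 days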

Now let’s think of a piece of gossip – say, the one about the Queen and the inflatable equerry. This one is really hot property, and everyone who hears the rumour will want to spread it to one uninfected person every morning. But, on reflection, most people will realise that it can’t be true… so let’s say that only one person in a hundred continues to spread the rumour for more than a week. People can be re-infected (and then spread the belief for another week), but with a 99% defection rate you would expect this piece of news to spread very slowly. In fact, infection outpaces defection so comprehensively that in just 35 days the entire population of the world believes the story.
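
Sceptical? Here is a rough cohort simulation of the gossip model. It ignores re-infection and assumes a single initial gossip and a world population of 6.5 billion (both my assumptions), yet it still lands within a couple of days of the figure above.

    from collections import deque

    WORLD = 6.5e9                             # assumed world population
    cohorts = deque([1.0, 0, 0, 0, 0, 0, 0])  # spreaders grouped by days spent spreading (0-6)
    veterans = 0.0                            # the 1% who carry on past a week
    ever_heard, day = 1.0, 0
    while ever_heard < WORLD:
        day += 1
        spreaders = sum(cohorts) + veterans
        newly_told = min(spreaders, WORLD - ever_heard)  # each spreader tells one fresh person
        ever_heard += newly_told
        veterans += cohorts.pop() * 0.01      # week is up: 99% give up, 1% carry on
        cohorts.appendleft(newly_told)
    print(day)                                # about 33 days under these assumptions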

So if you want to spread a belief the trick is to make it virulent rather than long-lived. This bears out our everyday experience – catchy songs and interesting lies go round a population much more readily than difficult but important concepts.

But we live in a sea of conflicting and constantly repeated messages, and people change their minds in the face of media bombardment. We find that, in the real world, beliefs reach a saturation point within a population. And that brings us to…


The Stability of Belief

The importance of understanding why beliefs stabilise can be illustrated by the following painful example. In March 2004, a year after the US-led invasion of Iraq and (crucially) a few months before a US presidential election, a survey of American voters showed that

  • 57% believed that Iraq had been directly involved in the attacks of 11th September 2001, or had given ‘substantial support’ to al-Qaeda prior to those attacks

  • 60% believed that Iraq possessed, or had been actively developing, weapons of mass destruction prior to the invasion.

Even though the first proposition had never been true and it was becoming painfully clear that the second proposition wasn’t true either, those percentages had been stable since before the start of the war in Iraq. Given that voting intentions in the forthcoming election were strongly linked to beliefs on the rightness of the war, these beliefs probably accounted for the re-election of George W. Bush by the narrowest of margins. Albeit to a much lesser extent, the same thing happened in the UK.

How did these beliefs reach such high, and stable, levels?

This is how. Suppose we have a simple proposition - say, the existence of Weapons of Mass Destruction (WMD) in a certain country. Every night, on the evening news, the population is exposed to three statements:

  1. our leader, a charismatic figure that you believe to be truthful, maintains that the country possesses WMD

  2. the man who runs that country - a complete fink – tells you that he doesn’t have WMD

  3. an honest but completely uncharismatic man tells you that he has been diligently looking for WMD but has found no trace of them so far.

Let’s assume that a person can believe one of three things: “Y”, the WMD exist; “N”, the WMD do not exist; or “DK”, they don’t know whether the WMD exist or not and won’t support a war until they are sure. If we ran a survey every night before the evening news, we could identify the proportion of the population holding each belief. Let’s assume that the starting proportions are:

10% believe the WMD exist (‘Y’)
20% believe the WMD do not exist (‘N’)
70% don’t know whether the WMD exist or not (‘DK’)

We also assume that a certain percentage of people will change their minds each time they are exposed to the arguments (by the way, the percentages used here are for illustration only and bear no relation to actual views at the time). Let’s say that the defection percentages are:

Has belief          Defects to ‘Y’    Defects to ‘N’    Defects to ‘DK’
Yes (Y)             95%               1%                4%
No (N)              5%                80%               15%
Don’t Know (DK)     10%               5%                85%

The first row says that 95% of people who believe the WMD exist will continue to believe this after the evening news, that 1% of them will be convinced by the counter-argument, and that 4% of them will move to the “Don’t Know” position. Similarly, most people who do not believe in WMD (or who don’t know) will continue to hold that view after each exposure to the evening news. Notice that most people maintain their beliefs and that each row adds up to 100%, meaning that everyone holds one of the three possible beliefs.

So after the first night, we know that the proportion of people who continue to believe in WMD is 10% (starting population) x 95% (retention percentage), or 9.5%. But converts also arrive: 20% x 5% (1%) defect from the “No” camp, and 70% x 10% (7%) from the “Don’t Know” camp. In short, the total proportion who believe in the WMD after the first night is 9.5% + 1% + 7%, or 17.5% - a substantial increase. Similarly, the “No” camp is down slightly at 19.6% and the “Don’t Knows” have dropped to 62.9% as our charismatic leader works his magic.
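
For those who would rather check the arithmetic than take my word for it, here is the nightly update as a few lines of Python (a sketch; the matrix rows are simply the defection percentages from the table above):

    # One night of evening news: redistribute each camp along its table row.
    start = [0.10, 0.20, 0.70]                # Y, N, DK
    P = [[0.95, 0.01, 0.04],                  # where the 'Yes' camp goes
         [0.05, 0.80, 0.15],                  # where the 'No' camp goes
         [0.10, 0.05, 0.85]]                  # where the 'Don't Know' camp goes
    after_one_night = [sum(start[i] * P[i][j] for i in range(3)) for j in range(3)]
    print(after_one_night)                    # roughly [0.175, 0.196, 0.629]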

[Graph: belief curves - original switching percentages]
Every night, our population hears the same messages - from our beloved leader, from the fink and from the honest but bland foreigner - and makes up its mind anew. The result after 30 days can be seen in the graph on the left.

As you can see, the proportion of people who believe that the WMD exist has risen rapidly from 10% to around 63% and has levelled off. Meanwhile the agnostics have slumped, and so it’s off to war we go.



[Graph: belief curves - different starting points]
Here’s a surprising thing: let us now suppose that the starting position was radically different - 0% believing in the existence of the WMD, 90% disbelieving and 10% unsure. The graph on the right shows the proportion holding each belief, again over one month.

While the graphs look different in shape and the “Don’t Knows” enjoyed a brief surge, the end result is exactly the same – about 63% believe in WMD, 10% disbelieve and about a quarter still don’t know. The dramatically different starting positions made no difference after the first three weeks.

Why didn’t the different start points make a difference? If you go back and look at the defection percentages, you will see that there is a rapid drift from “No” to “Don’t Know” and from “Don’t Know” to “Yes”, against a much slower drift in the opposite direction. The stable end state comes about because – and I’m sure we all remember this from our mathematics courses (ahem) – we have reached the equilibrium distribution of a Markov chain.

The key point here is that if the probabilities of changing belief are stable, then the end state depends entirely on those defection probabilities and not at all on the initial state. It may take longer to get there, but the end state is the same.
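
You can demonstrate this with a few more lines of Python: iterate the nightly update from the two starting positions above and watch them arrive at the same place (a sketch; I have allowed two months of nights to be safe):

    # The equilibrium ignores the starting split: iterate the nightly update
    # from two very different starting positions and compare.
    P = [[0.95, 0.01, 0.04],
         [0.05, 0.80, 0.15],
         [0.10, 0.05, 0.85]]

    def night(pop):
        return [sum(pop[i] * P[i][j] for i in range(3)) for j in range(3)]

    for pop in ([0.10, 0.20, 0.70], [0.00, 0.90, 0.10]):
        for _ in range(60):                   # two months of evening news
            pop = night(pop)
        print([round(x, 3) for x in pop])     # both print [0.634, 0.099, 0.268]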

Now let’s look at what happens if we make a minor change to the defection percentages (previous values are shown in brackets).

Has belief          Defects to ‘Y’    Defects to ‘N’    Defects to ‘DK’
Yes (Y)             93% (95%)         1%                6% (4%)
No (N)              5%                80%               15%
Don’t Know (DK)     5% (10%)          5%                90% (85%)

In this case, two extra believers in every hundred shrug off the charisma of our beloved leader each night and defect to the neutral “Don’t Know” camp, and those in the neutral camp are only half as likely to be converted and more likely to stay put once they arrive. The graphs below show the difference this relatively small change makes to our equilibrium. The results for the original set of percentages are shown on the left and those for the new set on the right.

[Graph, left: old switching percentages]

[Graph, right: new switching percentages]


So a relatively small change in the proportion of people who change their mind makes a huge difference in the belief equilibrium.
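
Again, you can confirm the two equilibria by brute force (a sketch; I simply iterate the nightly update until it settles):

    # Long-run splits under the old and new defection percentages.
    OLD = [[0.95, 0.01, 0.04], [0.05, 0.80, 0.15], [0.10, 0.05, 0.85]]
    NEW = [[0.93, 0.01, 0.06], [0.05, 0.80, 0.15], [0.05, 0.05, 0.90]]

    def equilibrium(P, pop=(1/3, 1/3, 1/3), nights=500):
        for _ in range(nights):
            pop = [sum(pop[i] * P[i][j] for i in range(3)) for j in range(3)]
        return [round(x, 3) for x in pop]

    print(equilibrium(OLD))   # [0.634, 0.099, 0.268] - a comfortable 'Yes' majority
    print(equilibrium(NEW))   # [0.417, 0.133, 0.45]  - 'Don't Know' is now the largest camp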

One final experiment: what if the defection percentages change part-way through? The graph here shows what happens if the first set of behaviours is followed for 30 days, and then something happens to change the public’s behaviours to the second set.


Wow! As expected, we end up at the new equilibrium, but look how quickly public opinion shifts! Within ten days, the doves outnumber the hawks.
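
Here is the same experiment as a sketch, swapping the matrices after night 30. Counting the “doves” as everyone outside the “Yes” camp is my reading of the graph, not a definition from the text.

    # Run the old percentages for 30 nights, then switch to the new ones.
    OLD = [[0.95, 0.01, 0.04], [0.05, 0.80, 0.15], [0.10, 0.05, 0.85]]
    NEW = [[0.93, 0.01, 0.06], [0.05, 0.80, 0.15], [0.05, 0.05, 0.90]]

    pop = [0.10, 0.20, 0.70]
    for night in range(1, 61):
        P = OLD if night <= 30 else NEW
        pop = [sum(pop[i] * P[i][j] for i in range(3)) for j in range(3)]
        if night > 30 and pop[1] + pop[2] > pop[0]:
            print("Doves outnumber hawks on night", night)  # within about ten nights of the switch
            break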

If you want to prevent a radical change of mind in your population, then when you spread your beliefs you should bundle in an immune system to protect the belief and prevent defection… but that’s a story for another day.

Lessons for Advertisers and Change Managers

Without reiterating how important persuasion is to our entire way of life, I hope you can see that this is important stuff (and I hope you haven’t heard it before). The implications for advertisers are pretty obvious, but I want to point out that change management initiatives succeed or fail depending on whether the target population receives and believes (and continues to believe) your messages.

So, here then is Farncombe’s seven-point plan for getting your message across.

  1. Repeat, repeat, repeat
  2. See point 1
  3. Understand the trade-off between richness and reach. If the message is simple, go for reach
  4. Understand, and build on, what people already believe
  5. True is good, virulent is better
  6. Remember that what people believe now is no indicator of what they will believe in future.
  7. By all means at your disposal, encourage belief and discourage defection

Now go forth, and spread your memes!


Acknowledgements


  • The four-stage model of belief transmission is based on the work of Professor Francis Heylighen of the Free University of Brussels.

  • The Markov chain is based on the writings of Stanford’s marvellous Professor Sam Savage. If this stuff interests you, try his ‘Decision Making with Insight’.

  • The survey on US public attitudes to the Iraq war was published by the Program on International Policy Attitudes (PIPA), and it consisted of a survey of 1311 respondents across the USA. Available here (retrieved 03-07-2007).

If this post has interested you, you might be interested to know I have recently written a book on memes in business, “The Success Virus”. You can find out more about it here.