How to Make the World Add Up by Tim Harford

Banner image: The Bridge Street Press


This article is part of a series in which OECD experts and thought leaders from around the world and all parts of society address the COVID-19 crisis, discussing and developing solutions now and for the future. It aims to foster the fruitful exchange of expertise and perspectives across fields to help us rise to this critical challenge. Opinions expressed do not necessarily represent the views of the OECD.

To keep updated on all of the OECD's work supporting the fight against COVID-19, visit our Digital Content Hub.


Excerpted from How to Make the World Add Up: Ten Rules for Thinking Differently About Numbers by Tim Harford. Copyright © 2020. 

All the statistical expertise in the world will not prevent you believing claims you shouldn’t believe and dismissing facts you shouldn’t dismiss. That expertise needs to be complemented by control of your own emotional reactions to the statistical claims you see.

In some cases there’s no emotional reaction to worry about. Let’s say I tell you that Mars is more than 50 million kilometres, or 30 million miles, away from the Earth. Very few people have a passionately held belief about that claim, so you can start asking sensible questions immediately.

For example: is 30 million miles a long way? (Sort of. It’s more than a hundred times further than the distance between Earth and the moon. Other planets are a lot further away, though.) Hang on, isn’t Mars in a totally different orbit? Doesn’t that mean the distance between the Earth and Mars varies all the time? (Indeed it does. The minimum distance between the two planets is a bit more than 30 million miles, but sometimes Mars is more than 200 million miles away.) Because there is no emotional response to the claim to trip you up, you can jump straight to trying to understand and evaluate it.
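In the spirit of asking sensible questions of a number, the comparison above can be sanity-checked in a few lines. The figures below are approximate standard astronomical values, not taken from the excerpt, so treat this as a rough sketch rather than a precise calculation:

```python
# Rough sanity check of the distance comparisons in the passage.
# Approximate figures in miles (standard astronomical averages).
EARTH_MOON = 238_855            # average Earth-Moon distance
EARTH_MARS_MIN = 33_900_000     # roughly the closest approach of Mars
EARTH_MARS_MAX = 250_000_000    # near the maximum Earth-Mars separation

ratio = EARTH_MARS_MIN / EARTH_MOON
print(f"Mars at closest approach is about {ratio:.0f} times "
      f"the Moon's distance")

# Consistent with the text: the minimum is "a bit more than
# 30 million miles", and Mars can be "more than 200 million
# miles away" depending on where both planets sit in their orbits.
```

Even with rounded inputs, the ratio comfortably exceeds one hundred, which is all the claim in the text requires.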

It’s much more challenging when emotional reactions are involved […]. Psychologist Ziva Kunda found the same effect in the lab, when she showed experimental subjects an article laying out the evidence that coffee or other sources of caffeine could increase the risk to women of developing breast cysts. Most people found the article pretty convincing. Women who drank a lot of coffee did not.

We often find ways to dismiss evidence that we don’t like. And the opposite is true, too: when evidence seems to support our preconceptions, we are less likely to look too closely for flaws. […]

Also on the Forum Network: Lessons in resilience from extreme economies by Richard Davies, economist and author based at Bristol University and the London School of Economics

In another experiment, students had a blood sample taken and were then shown a frightening presentation about the dangers of herpes; they were then told that their blood sample would be tested for the herpes virus. Herpes can’t be cured, but it can be managed, and there are precautions a person can take to prevent transmitting the virus to sexual partners – so it would be useful to know whether or not you have herpes. Nevertheless, a significant minority, one in five, not only preferred not to know whether they were infected but were willing to pay good money to have their blood sample discarded instead. They told researchers they simply didn’t want to face the anxiety.

Behavioural economists call this ‘the ostrich effect’. For example, when stock markets are falling, people are less likely to log in to check their investment accounts online. That makes no sense. If you use information about share prices to inform your investment strategy, you should be just as keen to get it in bad times as good. If you don’t, there’s little reason to log in at all – so why check your account so frequently when the market is rising?

It is not easy to master our emotions while assessing information that matters to us, not least because our emotions can lead us astray in different directions. […]

We don’t need to become emotionless processors of numerical information – just noticing our emotions and taking them into account may often be enough to improve our judgement. Rather than requiring superhuman control over our emotions, we need simply to develop good habits. Ask yourself: how does this information make me feel? Do I feel vindicated or smug? Anxious, angry or afraid? Am I in denial, scrambling to find a reason to dismiss the claim?

I’ve tried to get better at this myself. A few years ago, I shared a graph on social media which showed a rapid increase in support for same-sex marriage. As it happens, I have strong feelings about the matter and I wanted to share the good news. Pausing just long enough to note that the graph seemed to come from a reputable newspaper, I retweeted it.

The first reply was ‘Tim – have you looked at the axes on that graph?’ My heart sank. Five seconds looking at the graph would have told me that it was inaccurate, with the time scale a mess that distorted the rate of progress. Approval for marriage equality was increasing, as the graph showed, but I should have clipped it for my ‘bad data visualisation’ file rather than eagerly sharing it with the world. My emotions had got the better of me.

I still make that sort of mistake – but less often, I hope.

Also on the Forum Network: Fighting Disinformation: A key pillar of the COVID-19 recovery by Anthony Gooch, Director of the OECD Forum

I’ve certainly become more cautious – and more aware of the behaviour when I see it in others. It was very much in evidence in the early days of the coronavirus epidemic, as helpful-seeming misinformation spread even faster than the virus itself. One viral post – circulating on Facebook and email newsgroups – all too confidently explained how to distinguish between Covid-19 and a cold, reassured people that the virus was destroyed by warm weather, and incorrectly advised that iced water was to be avoided and that warm water kills any virus. The post, sometimes attributed to ‘my friend’s uncle’, sometimes to ‘Stanford hospital board’ or some blameless and uninvolved paediatrician, was occasionally accurate but generally speculative and misleading. Yet people – normally sensible people – shared it again and again and again. Why? Because they wanted to help others. They felt confused, they saw apparently useful advice, and they felt impelled to share. That impulse was only human, and it was well-meaning – but it was not wise.

Before I repeat any statistical claim, I first try to take note of how it makes me feel. It’s not a foolproof method against tricking myself, but it’s a habit that does little harm and is sometimes a great deal of help. Our emotions are powerful. We can’t make them vanish, and nor should we want to. But we can, and should, try to notice when they are clouding our judgement.

How to Make the World Add Up: Ten Rules for Thinking Differently About Numbers is available in the UK now from The Bridge Street Press.

The Data Detective: Ten Easy Rules to Make Sense of Statistics will be published in the US on 2nd February 2021 from Riverhead Books.

Related topics: Tackling COVID-19, Post-truth, Trust

Whether you agree, disagree or have another point of view, join the Forum Network for free using your email or social media accounts and tell us what's happening where you are. Your comments are what make the network the unique space it is, connecting citizens, experts and policy makers in open and respectful debate.

Tim Harford

Author, Senior Columnist, The Financial Times

Tim Harford is the author of "How To Make The World Add Up" ("The Data Detective" in the US). He is a columnist for the Financial Times, presenter of the BBC Radio show about numbers, "More or Less", and is an honorary fellow of the Royal Statistical Society.
