Thursday, 6 March 2014

When you measure social issues, they get worse

For reasons some people know locally, I'm only too aware of just how much domestic violence there is, in the UK and beyond.  It is insidious, destructive and much more common than most people think.

But yesterday's mega-survey of the EU, which found that the UK had some of the worst levels of domestic violence (the 5th worst, in fact), has set me thinking about the peculiar effects of counting, and what happens when you take statistics too seriously.

I'm aware that these are not official figures.  They are based on a survey, but the same peculiar effect applies, and I wrote about it in my book The Tyranny of Numbers.

In quantum physics, the mere presence of the observer in sub-atomic particle experiments can change the results. In anthropology, researchers have to report on their own cultural reactions as a way of offsetting the same effect.  Perhaps that is some clue to the peculiar way in which statistics tend to get worse when society is worried about something.

Why, for example, did the illegitimacy figures shoot up only after the war babies panic in 1915? At the time of the panic, the number of illegitimate births was actually astonishingly low – and the number of marriages strangely high. After the panic, the illegitimacy rate suddenly increased.

Why was the number of homes unfit for human habitation in the UK in 1967 (after the TV film Cathy Come Home) almost twice the figure for 1956 – despite over a decade of intensive demolition and rebuilding?

The garrotting scare in the 1860s was the same. The story began during the silly season in August 1862, and public horror got so bad that Punch was advertising a range of fearsome neck-guards with metal spikes. But the increase in the crime statistics came immediately afterwards, once the Garotters Act had brought back flogging for adults.

The tragic death of Stephen Lawrence in a racist attack led to widespread concern about race attacks in London. But after the public inquiry on the subject in 1998, Metropolitan Police figures for racist attacks leapt from 1,149 to 7,790 in one year.

It was the same with the sex abuse statistics. They toddled along in the UK at the 1,500-a-year mark until 1984, when an unprecedented wave of publicity on both sides of the Atlantic catapulted the issue to the top of the public agenda. 

Between 1984 and 1985, the NSPCC reported a 90 per cent increase in reported cases, and in the following year they reported a similar rise. 

Child abuse campaigners would say that the actual rate of child abuse is never reflected properly in the statistics. They may well be right – the same would be true of the figures for racist attacks. All I'm saying is that the actual statistics wouldn’t have told you anything, except how strongly the public felt about it at the time. 

So often, the statistics start rising after the panic, rather than the other way round, as an eagle-eyed society tries to stamp out the unforgivable. That’s the Quantum Effect of statistics.

It’s difficult to know quite why the figures go up. Sometimes the definitions change to reflect greater public concern. Sometimes people just report more instances of it because it is in the forefront of their minds. Sometimes, maybe, what we fear the most comes to pass.

You may be seeing something of the same Quantum Effect in the story about domestic violence.  The figures are high, both official figures and survey results, when people notice domestic violence.  They are high when people define incidents as domestic violence.

On the other hand, the figures are low when society turns a blind eye to domestic violence, when it dismisses incidents as just one of those things.  The statistics may tell the absolute reverse of the truth.

This may be an explanation for why the domestic violence figures seem worse in the liberal Scandinavian countries, where people are very aware of domestic violence and worried about it.

None of this suggests that domestic violence isn't an appalling cancer.  But it does suggest that the figures won't give you a very good guide to it – because they tend to get worse the more worried we are about the issue.

It is a strange phenomenon, and it gets stranger the more we trust the data: the more worried we are about something, the worse the figures get.

Subscribe to this blog on email; send me a message with the word subscribe to dcboyle@gmail.com. When you want to stop, you can email me the word unsubscribe.


