How not to be taken for a mug by misleading health stories this New Year

News & Analysis

31st December 2014

The Christmas/New Year period is always fun for health balls. Because we like drinking lots of wine and eating lots of chocolate around this time of year, newspapers like to pick up on weird little studies which purport to show that those things are good for us, while leaving out inconvenient details, stuff like ‘the study was on some tissue samples in a petri dish’ or ‘the study was on a chemical which exists in wine in trace amounts but we’re pretending it’s about wine in general’ or ‘obviously chocolate isn’t good for you, for God’s sake’. So here are some hints and tips to avoid being taken for a mug.

1) Read the whole story

The master of health-balls debunking, Ben Goldacre, first called attention to the sneaky journalistic trick known as the ‘caveat in paragraph 19’. You write your piece with its big exciting headline (‘Christmas crackers cause haemorrhoids’). You push your exciting reasons for it (‘A new study says that the action of cracker-pulling can lead to herniation of blood vessels around the anus’). And then, miles down your hugely long piece, you add the caveat (‘the study’s authors say that this study looked at the effects of fish oil on dementia and had no relevance whatsoever to crackers and haemorrhoids’) which renders your sensationalist headline utterly ridiculous, but which leaves your story technically ‘true’ even though no one will read that bit.

Example: the piece in the Daily Mail on Christmas Eve which asked (in its headline on the website’s home page) ‘Why does white wine send women crazy?’ ‘What is it about the drink of choice for so many women that sends them doolally – or ‘psychotic’, as one friend confessed?’, the article asked. ‘Is there something in the wine itself?’

After lots of anecdotes of people who made fools of themselves drinking white wine, and several paragraphs of generic stuff about how we drink bigger drinks than we used to, we get to the caveat in paragraph 25. ‘While there appears to be no published research – or evidence – that white wine per se is the culprit for so many women becoming so intoxicated, theories abound.’ So. The answer to the question ‘Why does white wine send women crazy?’ is simply ‘it doesn’t, particularly’.

2) Look out for bait-and-switch

‘Red wine prevents cancer!’ roars a headline every two days or so. This time I can point you to the Express: ‘Drinking a moderate amount of red wine could help beat cancer, new research suggests’.

You’d be forgiven for thinking that the study was into the effects of drinking red wine on the likelihood of getting or beating cancer. But in fact it’s a study into the effects of resveratrol, a chemical found in grape skins (and therefore red wine), on cancer. And, yes, it seems to reduce the likelihood of certain cancers. But since alcohol causes cancer, drinking red wine doesn’t make you less likely to get cancer: it just means that drinking red wine causes cancer somewhat less readily than drinking other kinds of alcohol. The author of the study himself says ‘The more you drink, the more you accumulate DNA damage, and the more chance that one or more cells will accumulate the specific type of DNA damage that can cause cancer… Resveratrol takes out the cells with the most damage.’

This is a classic bait-and-switch: pretend your piece is about something your readers know and care about (the effect of drinking red wine on cancer) and hide the fact that it’s really about something entirely different and less interesting (the effect of a chemical they’ve never heard of on cancer). And hope they don’t notice.

3) What have they tested it on?

What you’re looking for in the piece is something saying that this amazing new life-saving thing was tested on humans in a randomised double-blind controlled trial. Lots and lots of humans. Not mice, not fruit flies, not disembodied gut tissue in a vat. And most of all, not one person who reckons it’s done them good.

Especially if that person is selling a book about it, as is the case with the ‘alkaline diet’, being pushed by one Natasha Corrett in the Telegraph. She says it’s what helped her lose two and a half stone and cured the symptoms of her polycystic ovary syndrome. So she’s written a book of recipes.

But – and here’s the thing – lots of people lose weight without having a quacky diet recommended by their Ayurvedic acupuncturist. (Another tip: anything with the word ‘Ayurvedic’ near it is probably nonsense.) And sometimes the symptoms of an illness improve. ‘I got better, therefore this thing that I did around the time that I got better must have caused it’ is not good evidence.

4) If it’s interesting, it’s probably false

This is the sad thing. If you read any science story based on research which purports to overthrow the consensus view on any topic, it’s probably not true. Even if it’s been done on a relatively large sample of people. Even if it has a ‘p-value’ – the figure which purports to show how solid the statistics behind it are – below the arbitrary cut-off point for significance of 0.05. (Strictly, p < 0.05 doesn’t mean there’s a less than 1 in 20 chance that the result is a fluke; it means that if there were no real effect at all, a result at least this striking would turn up by chance less than one time in 20. Those are very different things.) If it’s big and newsworthy and exciting, be very wary of it.
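
One rough way to see what p < 0.05 does and doesn’t buy you: simulate thousands of studies of an effect that doesn’t exist, and count how often chance alone produces a ‘significant’ result. A minimal sketch in Python (the numbers are purely illustrative, nothing to do with any particular study):

```python
from random import gauss, seed
from math import erf, sqrt

def p_value(z):
    """Two-sided p-value for a standard normal test statistic."""
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

seed(42)
trials = 100_000
# Each simulated 'study' tests an effect that is genuinely zero:
# the test statistic is pure noise from a standard normal.
false_positives = sum(p_value(gauss(0, 1)) < 0.05 for _ in range(trials))
print(false_positives / trials)  # hovers around 0.05
```

Roughly one in twenty studies of non-existent effects still clears the 0.05 bar, and the exciting headlines are disproportionately drawn from that pile.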

It’s obvious when you think about it. Most research will confirm stuff we already know, for the simple reason that if we were constantly overturning everything we know, we wouldn’t actually know anything. So if one study says ‘X’ when hundreds of others say ‘not X’, it’s much more likely that the first study went wrong somehow than that everything we thought we knew is false. I won’t give an example here, because it is basically ‘all of science and health reporting’, but it’s worth keeping an eye out for.
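
The arithmetic behind ‘interesting is probably false’ is just Bayes’ rule. A sketch with made-up but plausible numbers – a 1% prior that a consensus-overturning claim is true, 80% power, the usual 5% false-positive rate; all three are assumptions for illustration only:

```python
prior = 0.01   # chance a surprising, consensus-overturning claim is actually true
power = 0.80   # chance a real effect yields a significant result
alpha = 0.05   # chance a non-existent effect yields one anyway

# Bayes' rule: P(claim is true | significant result)
p_true = (power * prior) / (power * prior + alpha * (1 - prior))
print(round(p_true, 3))  # about 0.14
```

On those numbers, even a properly ‘significant’ consensus-busting finding is true only about 14% of the time – which is why the single study saying ‘X’ usually loses to the hundreds saying ‘not X’.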