'A large grain of salt': Why journalists should avoid reporting on most food studies
This is an excerpt from Second Opinion, a weekly roundup of eclectic and under-the-radar health and medical science news emailed to subscribers every Saturday morning.
Should journalists just stop writing about food studies and end the noisy battle between headlines claiming that coffee or alcohol or some other everyday food or drink will save us or kill us?
It's a provocative question, especially considering nutrition research generates some of the most popular click-and-share health stories.
Remember the alcohol study from two weeks ago, with headlines warning that there is no safe level of consumption?
Several "reality check" articles quickly appeared as health journalists explained why the study is essentially meaningless for any individual reader attempting to apply the findings to their own moderate drinking habits.
Another recent study produced headlines that said "cheese and yogurt were found to protect against death from any cause" — a claim that was immediately mocked on Twitter with jokes about using a cheese stick instead of a parachute.
At the University of North Carolina, researcher Noah Haber sent out this finger-wagging tweet:
Spectacularly misleading headline for a spectacularly misleading study. Journals should not have accepted this paper, journalists should not have written about it, and I probably should not have retweeted it. https://t.co/sEfjJ1HXdD
Haber has a particular interest in those types of headlines because he studies the way scientists and the media deal with causal inference — whether the evidence is strong enough to establish a cause and effect.
In most nutritional research, it is not. And nutrition researchers know this. They are careful to report their findings as being "associated" or "linked" to a specific outcome, whether it's a disease or risk of death, or something positive like longer life.
But news reports often skip the nuance, resulting in headlines like these from just a few months ago: "Coffee key to longer life — study" and "Want to live longer? Science says drink more coffee."
Only a randomized controlled trial (RCT) can come close to establishing that an exposure to something causes a particular outcome. But RCTs — where researchers deliberately expose people to something and compare them to a group of people who were not exposed — are rare in nutrition. They're too expensive, and it's too difficult to study people over a long period of time in real-life eating situations. It's also unethical to expose people to something if the hypothesis suggests it will cause harm.
That's why epidemiologists have developed tools to tease out associations between exposure to a particular substance and health outcomes. Applied to exposures other than food, such as smoking, occupational hazards or other pollutants, those tools are more effective.
"Clearly, we can get solid answers that smoking kills people and there's absolutely no doubt about that. We can get pretty solid answers about air pollution," said John Ioannidis, who studies scientific methodology at Stanford University.
"Unfortunately, we cannot get solid answers about common nutrients and common foods with the same epidemiology tools we use in other domains."
That's because in dietary research, there is just too much noise, he said.
Most nutritional studies are based on observational data collected by asking people to remember what they ate and then running statistical analyses looking for links between nutrients and a particular health outcome, such as cancer.
Problems include the vast assortment of confounding factors: Are overweight people also under extra stress? Are red wine drinkers also wealthier? Are people who eat lots of processed food also struggling with lower incomes?
Add to that the differences in age, genetics, sleep rhythms, education levels, access to recreational facilities and community health services and on and on. With 250,000 different foods consumed in endless combinations, the constantly changing circumstances are too complex.
"Individuals consume thousands of chemicals in millions of possible daily combinations," Ioannidis wrote in an article published in JAMA last month.
"Disentangling the potential influence on health outcomes of a single dietary component from these other variables is challenging, if not impossible."
Researchers try to account for the confounding variables, but Ioannidis said they can't eliminate them. Eating is too tightly wound up in other social and behavioural factors that can affect health.
Implausible benefits or risks
For fun, and to make his point about the implausible findings, Ioannidis calculated some supposed life-extending benefits from published research. He concluded, for example, that eating 12 hazelnuts daily would prolong life by 12 years. Drinking three cups of coffee a day would provide an extra 12 years on top of that, and eating a single mandarin orange every day would add five more years.
"If you were to gain all the benefit speculated by each one of these studies, we would be able to live for 5,000 years," he said.
And when such a finding ends up in the headlines, no matter how carefully the language is worded, a causal relationship is still usually implied.
A study that says people who drink coffee also live longer does not mean that people can extend their lives by drinking coffee. (CBC)
"The question is, if it's not causal, why are you reporting it? What's the point?" Ioannidis said. "Once you report it, many people will be misled. People take it more seriously than they should.
"Even if you have capital letters — 'THIS IS NOT CAUSAL' — I'm not sure it will work."
There's an ongoing debate in the research community about all of this. A few years ago, a group of prominent nutrition researchers began campaigning to end the use of food questionnaires in research.
Haber and his colleagues conducted their own study to examine the extent of misleading causal inferences in academic papers and in the media coverage most widely shared on social media. They found many examples of weak evidence and overstated language in both the original studies and in the widely shared news stories.
"If you see something that says X is 'linked to' or 'associated with' Y on social media about a health study, you should have a large grain of salt, if not a whole bag of it," Haber said.
He said he's concerned the often conflicting results are eroding the public's confidence in science.
"It's unsurprising that people are questioning the value of scientific enterprise if what they're seeing is weak, conflicting, always flip-flopping sorts of studies."
Ioannidis said any true causal effects that might get lost in all of that noise are probably insignificant. After all, he said, we already know many of the answers to most of the important questions.
"We know we should not eat too much and we should not eat too little. And we should avoid major deficiencies."
So should journalists stop reporting on these types of studies? Ioannidis says yes.
"I think we are doing harm," he said. "We confuse people. We change their priorities."