Health journalism is often full of exaggerated, conflicting, or outright misleading claims. If you ever want to see a perfect example of this, check out “Kill or Cure,” a site where Paul Battley meticulously documents all the times the Daily Mail reported that various items, from antacids to yogurt, either cause cancer, prevent cancer, or sometimes do both.
And it’s not just this one newspaper. You’ve no doubt stumbled upon the many, many over-hyped media reports — online, on TV — that suggest that coffee prevents Alzheimer’s disease, that a new drug might end MS, or that red wine is the elixir for a long life. Not surprisingly, many of these claims have turned out to be misleading or wrong.
So who’s to blame for all these bad stories and the sorry state of health journalism? One new study, published in the British Medical Journal, assigns a large fraction of blame to the press shops at various research universities. The study found that releases from these offices often overhype the findings of their scientists — while journalists play along uncritically, parroting whatever showed up in their inbox that day. Hype, they suggest, was manufactured in the ivory tower, not the newsroom.
Overhyped health stories are a huge problem
Misleading health news stories are surprisingly widespread. One survey by Gary Schwitzer, who runs the watchdog website HealthNewsReview, looked at 500 health stories published in large newspapers over two years. Many of the stories were seriously flawed, he found, overplaying the benefits and underplaying the harms of various treatments, exaggerating the prevalence of diseases, and leaving out discussion of alternative options.
And these stories are a real health hazard. Lots of people make decisions every day based on the things they read in the media. A number of studies have documented the ways media coverage influences people’s health choices, from whether to get screened for breast cancer to whether to go in for a colonoscopy.
Ask any physician about the number of patients who have come into their offices, requesting some test or treatment based on something they saw in the media. I’ve heard from many of them. I’ve also heard from health ministers who say they rely solely on journalism to inform themselves about the latest science. I don’t need to tell you again about the alarming hold a certain TV doctor has on his audience — no matter how dubious his advice.
So who’s to blame for overhyped health journalism?
In this latest British Medical Journal study, the authors looked at 462 press releases about human health studies that came from 20 leading UK research universities in 2011. They then compared these press releases both to the actual studies and to the resulting news coverage.
What they wanted to find out was how overblown claims got made. Take, for example, the notion that coffee can prevent cancer. Did that come from the study itself, the press release, or was it a figment of the journalist’s imagination?
The researchers found that university press offices were a major source of overhype: over one-third of press releases contained either exaggerated claims of causation (when the study itself only suggested correlation), unwarranted implications about animal studies for people, or unfounded health advice.
These exaggerated claims then seeped into news coverage. When a press release included actual health advice, 58 percent of the related news articles would do so too (even if the actual study did no such thing). When a press release confused correlation with causation, 81 percent of related news articles would. And when press releases made unwarranted inferences about animal studies, 86 percent of the journalistic coverage did, too.
On the flip side, when press releases were free from exaggeration, the press was much less likely to exaggerate in those three areas (the rate of overhype declined to 17 percent, 18 percent, and 10 percent, respectively).
The scientists were usually present during the spinning process, the researchers wrote: “Most press releases issued by universities are drafted in dialogue between scientists and press officers and are not released without the approval of scientists and thus most of the responsibility for exaggeration must lie with the scientific authors.”
Other studies have also blamed press offices
The findings in this latest BMJ paper should not be a surprise to any health reporter or observer of health news.
A number of previous studies have come to nearly the exact same conclusions: that when academic press offices over-sell, mislead, or otherwise exaggerate the results of research, that very same framing trickles down to the journalists and bloggers covering science.
In 2002, researchers at Dartmouth did interviews with press officers at nine of the big medical journals. They found that none of these journals had any standards for acknowledging the limitations of a given study in the press release.
In 2009, the same researchers looked at a random selection of press releases from ten medical centers in the United States. They concluded, “Press releases from academic medical centers often promote research that has uncertain relevance to human health and do not provide key facts or acknowledge important limitations.”
In another 2012 study, also published in the BMJ, those researchers examined press releases from major medical journals and compared them with the resulting newspaper articles. They found a direct link between the scientific rigor in the press release and rigor in the related news stories. “High quality press releases issued by medical journals seem to make the quality of associated newspaper stories better,” they wrote, “whereas low quality press releases might make them worse.”
There’s a lot of other research documenting the same phenomenon — and the disturbing fact that bad health reporting can influence people’s health decision-making, from patients to doctors and policymakers.
How much blame do journalists bear?
Last year, I met the Dartmouth researchers doing work on press releases — Steven Woloshin and Lisa Schwartz. They told a group of journalists at MIT to be extremely careful in reporting on health, that everyone has an interest in exaggeration: from the pharmaceutical and device industry, to academic institutions, and the sponsors of big scientific meetings. “You have a tough job,” they said.
Of course, in an ideal world, journalists would be more skeptical. We would take time to carefully read every study we report on, and we would be armed with enough knowledge to be critical of the claims being made — in both the media releases and the original research. We would investigate every health claim before disseminating it to our large and vulnerable audiences.
We should do better. In fact, the longer I report on health, and see the impact of my own work, the more I believe that I need to think about the words I write in the same way a doctor thinks when he or she writes a prescription.
Unfortunately, however, this isn’t a perfect world. Many journalists cover science in the morning and the courts in the afternoon. We are under heavy pressure to meet multiple deadlines every day, and some of us lack the time, sources, or background to properly vet the studies we’re reporting on.
So we rely on scientists and on press offices to guide us through research, even though, clearly, we shouldn’t.
I don’t want to let journalists off easy. But journalists aren’t going to have the last word on scientific standards. The onus should be on academic institutions to do better. The authors of the BMJ piece also see their latest bit of damning evidence as a possible rallying cry for academics.