The importance of getting the numbers right
Anyone reading this site for more than a short while will probably have realized that one of my major pet peeves is the use of inaccurate figures by alarmists of various persuasions in an attempt to divert more resources to their favourite causes, rather than directing resources to each cause in proportion to its effect on people and on society. That came to mind again recently in an article subtitled More evidence that routine mammograms make healthy people sick. It addresses, in part, how, despite heart disease killing more women than all cancers combined, it's breast cancer that draws disproportionate attention (and funding):
What’s the No. 1 killer of women? It’s a question that practitioners asked every new patient at a clinic where physician Lisa Rosenbaum once worked, and she hasn’t forgotten the answer given to her by one middle-aged woman with high blood pressure and elevated blood lipids. “I know the right answer is heart disease,” the patient told Rosenbaum, “But I’m still going to say ‘breast cancer.’ ”
... The Rosenbaum commentary explores a phenomenon that Cass Sunstein dubbed “misfearing”—our human nature to fear instinctively, rather than factually. Rosenbaum’s patient’s first answer is correct—heart disease kills more women than all cancers combined, yet breast cancer seems to invoke far more fear among most women. ... Studies show that women—and doctors—grossly overestimate their risk of developing breast cancer and dying from it. ... I have to think that the media is partly to blame.
... Mammography proponents like Harvard’s Daniel Kopans are surely right that mammography has saved some lives. But the more important question is whether they’ve helped more women than they’ve harmed, and the evidence is now clearly pointing to no.
It's interesting to compare the research funding directed at each, in part a result of the attention each receives: breast cancer received $19,342 in research funding per death, whereas cardiovascular disease received a comparatively meager $2,659 per death - more than seven times as much per death for breast cancer. At what point should you begin to view those using inaccurate figures - figures that they should know are inaccurate if they'd investigated them - not as positive forces in society but as responsible for significant deaths?
Consider the death toll possible from shoddy research: just recently an article came out suggesting that research misconduct may have led to a significant number of deaths in Europe in recent years by causing poor treatment protocols to be recommended. The headline there suggests that the improper treatment may have caused 800,000 deaths over 5 years in Europe, but, particularly given the subject of this article, I should note that that 800,000 figure also seems dubious. To quote update #2 to the Forbes piece by those conducting the underlying research:
Our article is a narrative of events with a timeline figure and a context figure. We had not considered it to contain scientific statements, but we admit that it does multiply together three published numbers.
... Where our article relayed numbers, we made clear that alternative values were possible. The focus for readers was on how serious the consequences can be when clinical research goes wrong.