The battlefield for scientists fighting over how to best estimate war-related deaths has moved from Iraq to the Congo. Yesterday, a new report looking at the overall picture of wartime mortality offered new estimates of the number of people in the Democratic Republic of the Congo who had died due to the fighting there since 1998. The numbers are dramatically lower than the widely quoted 5.4 million figure issued by the International Rescue Committee (IRC). The new analysis, part of the Human Security Report 2009, funded by several countries and periodically released by Simon Fraser University in British Columbia, Canada, faults the methods used by IRC, challenging in particular the use of retrospective surveys to determine wartime mortality rates. IRC vigorously defends its work, however, and the dispute has clearly revived a battle among academics who previously tussled over counting war-related deaths in Iraq.
The report, The Shrinking Cost of War, broadly contests what it calls the "commonsense" assumption that mortality rates increase in countries during times of war. In recent decades, the changing nature of military conflicts around the world, the dramatic improvements in peacetime public health that limit deaths due to illness during wars, and the growing effectiveness of humanitarian assistance during war have combined so that "nationwide mortality rates actually fall during most wars," the report concludes.
But drawing the most attention and debate in the new report is a chapter that deals with the deaths related to the multiple conflicts that have enveloped the Congo over the past few decades. The report, for example, says: "... We demonstrate that the IRC's 5.4 million estimate is far too high. We further argue that estimating excess war deaths--which include those from war-exacerbated disease and malnutrition, as well as war-related injuries--is a task so fraught with challenges that it can rarely succeed."
The report criticizes IRC's past work on two main fronts. It argues that IRC didn't conduct its surveys in the appropriate parts of the country and that IRC adopted an overly low baseline mortality rate for the Congo, which created the appearance of more war-related deaths. The report says:
The results of the IRC's first two surveys, which covered a period between August 1998 and March 2001, were restricted to the violence-wracked eastern part of the country. They indicated that the war had generated approximately 2.5 million excess deaths.
But, the IRC's researchers did not select the areas to be surveyed in a way that ensured they were representative of the region as a whole. This failure to follow standard survey practice means no confidence can be placed in any excess mortality estimates from this period--although no one doubts the death tolls in parts of the region were very high.
But, even if this critical misstep is ignored, other methodological errors, including reliance on the too-low baseline mortality rate, led to large and unwarranted inflations of the excess death estimates. For example, when the Human Security Report Project's research team corrected for a series of erroneous assumptions in one of the IRC's calculations for the period covered by the first survey, the excess death toll fell from 1.6 million to just 678,600--a decline of almost 60 percent.
The excess death estimates for the final three surveys, the only ones to cover the entire country, were not affected by the methodological errors evident in the first two surveys. Here, the major problem, as mentioned above, lay with the inappropriately low baseline mortality rate. The impact of changing this rate to a more appropriate one was dramatic. The estimated excess death toll dropped from 2.8 million to less than 900,000. This is still a huge toll, but it is less than one-third of the IRC's original estimate for the period.
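The baseline dispute the report describes comes down to simple arithmetic: excess deaths are the gap between the mortality rate measured during the war and an assumed peacetime baseline rate, multiplied by the person-time at risk. The sketch below uses entirely hypothetical numbers (not the IRC's or the Human Security Report Project's actual figures) to show how sensitive the resulting estimate is to the baseline chosen.

```python
# Illustrative sketch (not the IRC's or HSRP's actual model): how the choice
# of baseline crude mortality rate (CMR) drives an excess-death estimate.
# All figures below are hypothetical, chosen only to show the arithmetic.

def excess_deaths(observed_cmr, baseline_cmr, population, months):
    """Excess deaths = (observed rate - baseline rate) x person-time.

    CMRs are expressed as deaths per 1,000 people per month, a unit
    commonly used in humanitarian mortality surveys.
    """
    excess_rate = observed_cmr - baseline_cmr          # per 1,000 per month
    return excess_rate / 1000.0 * population * months

population = 60_000_000   # hypothetical national population
months = 12               # one year of conflict
observed = 2.1            # hypothetical survey-measured wartime CMR

# A lower assumed peacetime baseline attributes more of the observed
# mortality to the war; a higher baseline attributes less.
low_baseline = excess_deaths(observed, 1.5, population, months)
high_baseline = excess_deaths(observed, 2.0, population, months)

print(f"excess with 1.5 baseline: {low_baseline:,.0f}")   # 432,000
print(f"excess with 2.0 baseline: {high_baseline:,.0f}")  # 72,000
```

With these made-up inputs, raising the assumed baseline from 1.5 to 2.0 deaths per 1,000 per month cuts the excess-death estimate by a factor of six, which is the kind of sensitivity both sides are arguing over.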
Andrew Mack, director of the Human Security Report Project, notes that he and his colleagues initially accepted IRC's results but slowly came to believe there were significant problems with them. "Retrospective mortality surveys appeared to provide a logical answer to the [war-related deaths] measurement problem--we wrote admiringly about the IRC's work in the Congo in our 2005 report. But the more we looked the more insuperable the challenges appeared," he says.
IRC and the Burnet Institute, which co-led and authored the final two surveys, last night released a joint response (IRC Burnet Response - HSR.docx) to the Human Security Report's critique, saying:
The methodological limitations they criticize have already been acknowledged by the IRC and widely discussed in the process of scientific peer review and at academic conferences, with broad agreement that they do not invalidate our findings. There is, in fact, little that is new in the authors' observations, and overall, their arguments are undermined by inconsistencies, conflicting evidence and poor scholarship.
... The significant challenges of conducting surveys in conflict zones were widely acknowledged and reported in the surveys. But even before their release, the [IRC] reports were reviewed by scientists from the Centers for Disease Control and Prevention, Harvard, and Columbia University to ensure academic rigor. In spite of the limitations, our conclusions were widely accepted as valid by independent experts.
IRC and Burnet maintain that their surveys, "based on standard and scientifically-grounded methodology, helped reveal the true scale of suffering in one of Africa's largest countries. The results were widely cited, accepted by independent experts and published in three established medical journals after extensive peer-review. The findings remain the best estimates available of conflict-related mortality in Congo."
Les Roberts of Columbia University, who contributed to the last Human Security Report and was involved in some of the IRC's Congo surveys, also defended the IRC's results in a strongly worded letter to Andrew Mack that called the new report "unscholarly" and said its "unjustified conclusions ... will leave the world more ignorant." (full letter below, and appendicies to Les Roberts's letter to Andrew Mack.doc)
I was sorry to see the Human Security Report (HSR) released today. I was sorry because this report draws unjustified conclusions and will leave the world more ignorant and misguided for its release. There are four very weak aspects of this report that led to this opening line which I find most problematic, "this report reveals that nationwide mortality rates actually fall during most wars":
1) This and many other conclusions are solely a function of the low threshold chosen to define "war," considering it to be ongoing with just 25 killings per year. If war was instead defined as occurring in a population where 0.1% was violently killed in a year, I strongly suspect almost all of the HSR conclusions would reverse. This definition would be closer to the public image of war and to where humanitarian aid dollars flow.
2) The report is rife with profound inconsistencies of logic. Moreover, the report completely contradicts a main theme of the last Human Security Report ("War-related diseases kill and disable far more people than bombs and bullets"). The conclusions about giving up on surveys to directly measure war-time excess deaths contradict the conclusion from the meeting you hosted in March, 2004 with a collection of highly regarded experts including: Jennifer Leaning, Debbi Sapir, and Richard Garfield. Some of the more egregious internal inconsistencies are listed in Appendix B.
3) The report is unscholarly, not fully exploring sources, citing one source for one point and ignoring that source elsewhere. It was particularly selective to cite Chris Murray's 2002 BMJ article as "much-cited" but not the follow-up 2008 BMJ article with Ziad Obermeyer, which shows that the PRIO dataset on which the HSR is largely based misses most deaths. A list of serious inconsistencies or errors is included as Appendix C.
4) The HSR claims war does not stop the usual mortality decline seen in most poor nations, but then does not study or report on those people affected at the times of war. The report looks at entire nations where you admit a tiny fraction of people are affected for a tiny fraction of the study period, and draws conclusions with data so crude and general as to be meaningless. The report uses national, time-smoothed data...without the appropriate confidence intervals...to detect the effects of armed conflict. For those of us who were in Rwanda in 1994 and saw those thousands of child bodies dumped in the mass graves, the idea presented in the HSR that 1995 and 1996 were less healthy for children than 1994 is incomprehensible.
As a contributor to the last Human Security Report, I was sorry to see this report. As one of the main forces of accountability in humanitarian assistance, the Canadian Government should be mortified by this report and its, perhaps inadvertent, assault on SMART and relief accountability. As a scientist, I am disheartened to see all this money spent on the HSR to make the academic community more fractious. Many years ago we went out and attempted to report to the world about an unfolding crisis in the Congo. We did it carefully, but as we described at the time, crudely, at great risk to life and limb, and at only a few percent of the cost of this Human Security Report. It is unbecoming to grab a headline a decade after by tearing down a study with erroneous speculation. If you want to advance this field, there are apparently under-reported crises underway in Somalia and Northern CAR; go there and do better.
Both sides in this academic conflict are now lining up supporters for their analyses--and offering those names to the media. Clearly, this is one skirmish that remains far from a peaceful resolution.