A bunker-busting academic data bomb has just been dropped on the long-suffering Congolese people after the release of a report by the Human Security Report Project at Simon Fraser University in Vancouver, Canada. The mainstream press fanned the resulting firestorm of academic debate on methodology by misquoting and misinterpreting death toll numbers in headlines that have now virally spread throughout cyberspace. The resulting confusion has dealt another body blow to humanitarian efforts in the Democratic Republic of Congo.
In a phone conversation today, the Human Security Report editor, Dr. Andrew Mack, categorically denied that he stands behind a media-attributed death toll number of 900,000, which is far below accepted estimates of 3 to 7.5 million souls lost. The latter figure is accepted by donor nations and non-governmental organizations when determining aid packages. “What we are really saying in our methodology argument is that no one knows the correct figure,” Mack said.
So it seems the press is largely to blame for the confusion in a quest for sensationalist headlines that used an unresolved academic debate to obfuscate the situation in Congo.
At risk is an accurate depiction of the scope and resolution of the humanitarian crisis in Congo. The International Rescue Committee (IRC) conducted epidemiological surveys between 2000 and 2007, concluding that there were millions of excess deaths attributable to the war in Congo. "Excess deaths" means deaths above the baseline mortality expected in peacetime; the figure therefore includes not only combat deaths but also mortality from disease and hunger. The 2004 and 2007 IRC surveys were conducted with the Burnet Institute of Australia. The results were widely cited (I and many other journalists use them), are accepted by independent experts, and were published in three established medical journals after extensive peer review.
The Human Security Report Project’s paper, “The Shrinking Costs of War,” criticizes large-scale mortality surveys in general and the IRC’s Congo studies in particular. The IRC understandably has offered a rebuttal of the Simon Fraser Human Security Report.
The methodological limitations they criticize have already been acknowledged by the IRC and widely discussed in the process of scientific peer review and at academic conferences, with broad agreement that they do not invalidate our findings. There is, in fact, little that is new in the authors’ observations, and overall, their arguments are undermined by inconsistencies, conflicting evidence and poor scholarship.
… The IRC and Burnet used sound scientific methods for estimating the previously unknown cost of war in Congo, showing that millions, rather than thousands, had died as a result of the war and its aftermath. We believe this information is valid and that it has been and continues to be of essential value to public health and political decision-makers.
The two disputed issues are the baseline mortality rate and the IRC's selection of representative areas for sampling. The Human Security Report suggests that the IRC picked a baseline mortality rate for Congo that was too low, thereby inflating conflict-related deaths. This is basically a statistical and methodological debate, and Dr. Mack conceded that there are "no estimates that are precise." Mack explained that the 900,000 number was derived by substituting a higher baseline mortality rate into the "low" figure of 2.83 million dead in the last IRC report. "We do not deny the veracity of the IRC report," Mack said.
Mack’s main contention, echoed in the Simon Fraser Report, is that “You cannot use population surveys to compute excess deaths.” Whether it is better to rely upon “passive data” on “battle deaths” collected by Uppsala University is not something that should be defended or debunked in the media by non-statisticians such as myself.
However, it does raise my eyebrows and create new furrows to learn that the data is collected from afar and on an annual basis, so that information is related to activities during one calendar year. “This is so even if conflicts may start and/or end at dates that do not fit this pattern. Still, the emphasis is on the year as the basis for comparison and computation,” says the Uppsala website.
Accurate? I don’t know.
It is also noteworthy, and Dr. Mack readily offered this information, that the main financial support for the Human Security Research Project at Uppsala University comes from Simon Fraser University and DFID (the United Kingdom Department for International Development).
Should Uppsala feed data to the think tank that funds it?
The IRC raised the issue of media misrepresentations and unreliable data in its rebuttal of the current press coverage of the Simon Fraser Report.
Indeed, an unsourced New York Times estimate of 100,000 deaths in Congo in 2000 was among the reasons the IRC conducted its first survey. That study’s findings revealed to the world the full scale of Congo’s humanitarian crisis, leading to major changes in humanitarian policy and international political engagement. A review of mortality surveys in Darfur, Sudan, by the US Government Accountability Office was used to counter inflated mortality estimates that were being promulgated by various groups. Similarly, a Kosovo-wide survey in 1999 led to an authoritative estimate of war deaths that helped respond to claims of higher death tolls. The record shows that excess mortality estimates derived from retrospective surveys can be invaluable for guiding humanitarian programming and policy and for responding to the politicization of human suffering.
Image: Displaced Persons Camp on outskirts of Goma
It is far beyond unfortunate that this academic debate stands to produce a possible humanitarian aid backlash for the Congolese people.
In addition, the authors of the Simon Fraser Report were never in-country and used a completely different methodology from the one used by the IRC, which conducted in-country surveys. This is an academic debate of apples and oranges if one considers the tenets of the scientific method, a procedure that requires a hypothesis be tested through consistent replication of results using the same methodology before a theory can even be established.
This debate should not be conducted in the press, and it is highly unfortunate that the headlined 900,000 number may become the new “fact,” because of an academic paper whose authors readily admit that they “do not know” the real numbers.
This is certainly a confusing and unresolved debate, but there are some other facts that should be considered.
Maurice Carney of Friends of the Congo, an advocacy group based in Washington DC, says that both studies have omissions, extrapolations, and perhaps political motivations.
Both studies omit the 1996–1997 period, when hundreds of thousands may have been slaughtered, according to the Garreton Report.
The IRC report's coverage ended in the spring of 2007, so no estimates have been made since that time, leaving a total of five years undocumented.
The interesting thing about the extrapolations made by the IRC is that the figures the Simon Fraser report questions could very well be underestimates, not overestimates.
In the final analysis, no one is denying that millions have died. Whether the figure is 3 million or 6 million, after a certain point one thing is not in question: the gross nature of the conflict, which continues to this day. The IRC is on solid ground except for its timeline (1998–2007), which we believe is political in nature.
Dr. Les Roberts, a Columbia University professor who helped write the Congo study for the IRC, wrote in a letter to Dr. Mack that the report “draws unjustified conclusions and will leave the world more ignorant and misguided for its release.”
A particularly troubling fact about the Simon Fraser report is that it does not list authors, only contributors. I asked Dr. Mack about this and he would only say that he was an editor. When asked if an economist named Michael Spagat contributed, Mack said that Spagat was a “technical advisor.”
This is important information, since Spagat worked with a project in England called Iraq Body Count, which monitored deaths in Iraq by “passive surveillance” methods, such as documenting news reports of casualties.
Spagat skewered the 2006 Lancet report, which estimated 650,000 civilian deaths from the Iraq war, and his criticism received more media coverage than the peer-reviewed Lancet article itself. I remember this well, since I was working on the Coleen Rowley for Congress campaign in Minnesota at the time, and the Lancet article and the subsequent media criticism were huge topics of debate.
Now, “passive surveillance” systems of data collection have entered the media lexicon. The bottom line, according to experts, is that surveys tend to yield higher death rates.
So that is the academic debate.
I will admit upfront that I am not sure I have been able to describe this debate without my own bias interfering. Having spent considerable time in Congo, experiencing detention and imprisonment, I have a deep affinity for the struggles of the Congolese. I do not want to see their suffering diminished by academic haggling.
I struggle over my own reports, written from the safety of the United States, and look forward to the next time I can join the Congolese and report first-hand on their struggle for the simplest vestiges of humanity. Bias is inevitable when one witnesses the complete breakdown of all that one holds holy, but academic debates should not become headlines.
One might also ask the question, which is the elephant in the room: is the criticism of the IRC reports put forth by the Human Security Report Project a veiled way of fueling an academic feud that began with the Lancet's estimate of high civilian casualties in Iraq?
Guesswork, when human life and complete societal breakdown are at stake, cannot be tolerated.