Counterproductive antiwar arguments
Antiwar arguments based primarily on promoting the highest death estimates from war, while dismissing all lower figures as pro-war propaganda, tend to be counterproductive and serve to alienate journalists who might otherwise be sympathetic to the antiwar case.
By Robert Shone on Monday, September 20th, 2010 - 1,377 words.
A popular, but counterproductive, “antiwar” argument works roughly as follows:
1. Promote the highest death count (eg a million Iraqi deaths).
2. Reject lower figures as pro-war propaganda.
3. Cite evidence which supports this position.
4. Ignore evidence which refutes this position.
Of course, it’s not really an “argument” in itself, and it’s counterproductive for at least two reasons. First, even if you can cite supporting evidence, your “argument” is undermined if the scientific evidence which refutes it starts to look substantial and comes from authorities in the field.
Second, it reinforces pro-war framing (inadvertently of course). The notion that “low” death counts (eg “only” 100,000) are war “propaganda” makes sense only from a simplistic “algebra of death” frame in which human lives can be traded for others like casino chips. Without this frame a “low” death count is simply a research finding, not “propaganda”.
(Ironically, “excess” death estimates from epidemiological surveys can show a “net benefit” from war in terms of this death-algebra. An Afghanistan study by Gilbert Burnham found lower infant and child mortality after the invasion, due to improved medical care – implying that lives saved exceeded deaths from the fighting. Was this “pro-war propaganda”? Burnham also co-authored the 2006 Lancet Iraq study.)
Another weakness with the above “antiwar” “argument” (the one which insists on the highest available death estimate) is that it typically relies on a highly selective form of credentialism, which has a tendency to backfire. This has been the case in a striking way with the Iraq conflict, where the peer-reviewed studies refuting the highest estimates now far outweigh the supporting studies (as I documented in a previous Comment Factory piece – see reference section).
Not only that, but several experts (and commentators) who were initially cited in support of the highest mortality estimates have apparently changed their minds as a result of the publication of new studies and new (critical) information on previous studies. For example, Paul Spiegel, an epidemiologist at the UN, commented on the later Iraq Family Health Survey (which estimated 151,000 violent deaths over the same period as the Lancet 2006 study which estimated 601,000):
“Overall, this [IFHS] is a very good study” [...] “What they have done that other studies have not is try to compensate for the inaccuracies and difficulties of these surveys” [...] “this does seem more believable to me [than Lancet 2006]” (Paul Spiegel, Washington Post, 10/1/08)
Patrick Ball, a statistician known for his work on casualty estimation in human rights investigations, made a similar concession:

First, I want to be clear that I have no interest in defending the Burnham et al. [Lancet 2006] estimates. The flaws in that study are now well known. (Patrick Ball, 28/4/2010)
Stephen Soldz, a psychoanalyst and antiwar activist, was a vocal advocate of the Lancet Iraq studies. He appeared to be one of the better-informed commentators, and had spoken directly with some of the authors of the main mortality studies. Then, in March 2009, he wrote an article for ZNet which expressed a change of mind over the Lancet 2006 study:
If one major methodological detail was distorted, we simply cannot know whether other aspects of the study were carried out as stated. Until and unless there is far greater detail on these methods, I do not feel that their estimate of 650,000 post-invasion surplus deaths can be trusted. (Stephen Soldz, ZNet, 16/3/09)
I’ve previously noted that Richard Kulka, President of the American Association for Public Opinion Research (AAPOR), criticised the authors of the Lancet 2006 study for not answering “even basic questions about how their research was conducted”. It seems that Soldz had similar concerns with unanswered questions about the study’s sampling methods, etc. Soldz wrote that “As long as these questions remain, the study cannot be considered reliable”.
For those with an emotional investment in the above “antiwar” “argument” (the one which insists on the highest available death estimate), Soldz’s article was nothing less than blasphemy and betrayal. He was immediately, and viciously, attacked – on the Media Lens website, for example. David Edwards (a Media Lens editor) wrote that Soldz’s mind was “showing the signs of what might be called propaganda weathering and erosion”. One of Media Lens’s followers, Gabriele Zamparini, wrote that Soldz was providing “propaganda for the mass murderers”.
At this point Soldz must have been feeling the same way that IBC probably felt when accused by Media Lens of “providing powerful propaganda for people responsible for horrendous war crimes”. Soldz, who was previously a supporter of Media Lens, replied in uncharacteristic fashion:
I used to think that we needed a group like Media Lens in the US. Now I am simply thankful that this particular flavor of Stalinism hasn’t yet emigrated. The society you fight for would end up, despite the nice words, as being one of the jackboot stomping on the human face forever. (Stephen Soldz, 18/3/09)
This episode provides a good example of how the above “antiwar” “argument” (the one which insists on the highest available death estimate) divides people who share an abhorrence of the suffering caused by war. I’ve seen many of these counterproductive disputes, and I think they draw attention away from core antiwar arguments which do not depend on accepting a given death toll. They also alienate journalists who may be sympathetic to antiwar viewpoints, but who are criticised for not sufficiently promoting the “correct” (ie highest) estimates.
One rational response to perceived media “suppression” of death tolls is to demand more information on the whole range of studies, rather than to lobby for greater emphasis on one’s favoured estimate. It’s counterproductive to insist that the media is systematically suppressing the higher estimates, when the evidence indicates that the 2006 Lancet study received far more coverage than those which provided much lower estimates (ie IFHS, ILCS – the two largest Iraq surveys, and arguably the ones with the best quality control).
The Lancet 2006 estimate was given headline coverage on BBC1 News and BBC2 Newsnight on the day of its publication. IFHS wasn’t mentioned at all by the main BBC programmes. Few people have even heard of ILCS (a massive 2004 study which dwarfs the two Lancet studies) – a reflection of the amount of media coverage it received. The ORB poll (estimating over a million deaths) was mentioned on BBC2 Newsnight, and seems to have received more media coverage than IFHS and ILCS, even though it wasn’t peer-reviewed science (and was conducted by someone who began his polling career in 2003, with little in the way of formal training or field experience – according to ORB’s publicity literature).
Iraq Body Count’s figures are often cited (although I’ve never seen IBC given headline TV coverage in its own right) for at least two good reasons: it’s an ongoing, updated tally (unlike one-off survey estimates), and it provides a database of actual, documented deaths which, although incomplete, has not been seriously challenged in the scientific literature (unlike the survey estimates based on statistical projections). Some antiwar campaigners overlook these reasons, and assume that the use of IBC’s figures proves that the “corporate media” is “playing down” the scale of destruction. This might appear to be true in some cases (some journalists/editors supported the war, after all), but the notion of “playing down” makes sense only in terms of the death-algebra frame, and only if there’s a “known”, “actual” figure with which to compare the “played down” figure. If there is a systematic “playing down” in the “corporate media”, one wonders why the Lancet 2006 estimate has received so much more coverage than IFHS’s figure, which was lower by 450,000 violent deaths for the same period surveyed.
It doesn’t take much to simply acknowledge that there is, as yet, no scientific consensus over the number of Iraqi deaths, and to give journalists (who may be sympathetic to the antiwar cause) the benefit of the doubt – rather than assume that they must be unwitting agents of Establishment Power since they don’t have absolute faith in the highest estimates. Clearly it is important to quantify deaths from war, and this has a role in ending or preventing conflicts. But it’s not about promoting a case against (or for) a war in terms which reduce human lives to mere units in some unverified ballpark figure.