Politics : Formerly About Advanced Micro Devices


To: sylvester80 who wrote (1191472)
From: longnshort, 1/7/2020 12:09:33 PM
Recommended by: FJB
Fudged statistics on the Iraq War death toll are still circulating today

April 13, 2018 10.50am EDT

Author: Michael Spagat, Professor of Economics and Head of Department, Royal Holloway


Disclosure statement: Michael Spagat does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond his academic appointment.

Royal Holloway provides funding as a member of The Conversation UK.

Who’s counting? EPA/Stefan Zaklin
What happens when a scientific journal publishes information that turns out to be false? A fracas over a recent Washington Post article provides an illuminating case study in how, even years after they’re published, uncorrected false claims can still end up repeated time and again. But at the same time, it shows how simply alerting responsible journalists and news editors to repeated errors can do a lot to combat false claims that stubbornly live on even after they’ve been debunked.

It all started with a 2006 article published in the eminent journal The Lancet, entitled Mortality after the 2003 invasion of Iraq: a cross-sectional cluster sample survey. The article had many problems, but one of its graphs in particular stuck out. That one figure displayed new estimated numbers of violent deaths in the Iraq War, and came up with numbers massively higher than anything anyone had seriously suggested before.

And although the article’s reported violence numbers increased over time far more rapidly than those reported by other sources, including the Iraq Body Count (IBC) project, the graph gave the inaccurate impression that IBC trends actually tracked their new data quite closely – ostensibly validating what, at first glance, seemed like a very hard-to-swallow new dataset.

In addition, the graph included a third dataset purporting to show violence trends measured by the US Department of Defence (DoD), trends that were again presented as consistent with the authors’ new data. The finished graph was central to the paper’s effort to “mainstream” the shocking new numbers by connecting them with other data on war violence.

Yet a few weeks after the article was published, letters sent to the Lancet from other researchers discredited the graph entirely.

Falling apart

First, Debarati Guha-Sapir and two colleagues pointed out that the graph used two Y axes, a device notorious for creating the illusion that two curves moving in the same direction at different speeds are in fact moving at the same speed.
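The illusion is easy to demonstrate with made-up numbers (these are illustrative only, not the actual Lancet or IBC figures). Giving each series its own Y axis effectively rescales each one independently to fill the plot, so two lines with wildly different slopes can be made to overlap exactly:

```python
# Illustrative only: two invented series growing at very different rates.
slow = [10 * m for m in range(1, 14)]      # grows by 10 per month
fast = [1000 * m for m in range(1, 14)]    # grows by 1000 per month, 100x faster

def rescale(series):
    """Min-max rescale to [0, 1] -- visually, what an independent Y axis does."""
    lo, hi = min(series), max(series)
    return [(x - lo) / (hi - lo) for x in series]

# Drawn against separate axes, each curve fills its own scale, so the two
# rescaled curves coincide exactly despite the 100x gap in growth rate.
print(rescale(slow) == rescale(fast))  # True: visually indistinguishable
```

This is why a dual-axis chart can make almost any two monotone series look like they "track" each other.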

But this was just one of the problems with the graph. The article’s authors also tweaked the trends by comparing their own data with cumulative IBC data. Specifically, they plotted the first 13 months of their data against 13 months of IBC data, their second 13 months against 26 months of IBC data, and their third 13 months against 39 months of IBC data.
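The effect of that period-versus-cumulative mismatch can be sketched numerically (again with invented numbers, not the real datasets). Even a perfectly flat monthly count turns into a steeply rising line once it is accumulated, ready to "match" any rising series plotted next to it:

```python
# Illustrative only: a hypothetical, perfectly flat monthly death count.
monthly = [100] * 39  # 39 months at a constant 100 deaths per month

# Period totals for each 13-month block -- the true trend is flat:
periods = [sum(monthly[i:i + 13]) for i in (0, 13, 26)]
print(periods)      # [1300, 1300, 1300]

# Cumulative totals at 13, 26 and 39 months -- the same flat data now
# appears to triple across the three plotted points:
cumulative = [sum(monthly[:n]) for n in (13, 26, 39)]
print(cumulative)   # [1300, 2600, 3900]
```

Comparing 13-month slices of one dataset against 13, 26 and 39 months of another therefore builds in an apparent upward "agreement" regardless of what the underlying rates are doing.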

And in another published letter to the journal, IBC’s Joshua Dougherty demonstrated that the DoD curve the authors included did not represent what they said it did. For example, going back to the source, he said the DoD data they cite include both deaths and casualties, and “do not offer any direct means by which to calculate what number might be deaths, let alone civilian deaths”.

Rather surprisingly, in their reply – also published in the Lancet – the authors actually admitted to several problems with their graph. But their mea culpa was grudging and incomplete. In it, they suggested that the issues with the graph are merely technical, and that the trends really do match – but as the letters mentioned above pointed out, that is not correct.

One would think that by this point, deafening alarm bells would have been ringing in the Lancet editors’ ears. After all, they had just published a highly compromised graph that two letters had utterly discredited. And the authors had even admitted errors, an unusual development to say the least. Surely the Lancet would withdraw the graph; maybe they would at least leave it up online but post a warning sign nearby. But no. Instead, they just left the article and the graph online in perpetuity.

And so, in spring 2018, enter the Washington Post and its reporter Philip Bump.

Caught out

To his credit, Bump wrote and published an article marking the 15th anniversary of the invasion of Iraq, an event that’s getting far too little media attention in the UK and US. The article read well – that is, until it landed on the discredited graph, which it reproduced and presented as if it were legitimate.

The Lancet could argue that if Bump had only read the follow-up letters it published, he never would have reprinted the discredited graph. But this argument is akin to saying that there is no need for warning labels on cigarettes because people can just read the scientific literature on smoking and consider themselves warned. In practice, many people will simply assume the graph is kosher because it sits on the Lancet website with no warning attached.

A scene in Fallujah, July 2003. EPA/Jamal Nasrallah

As you might expect, the dust-up over the graph is just the tip of the iceberg with the 2006 article. I myself have comprehensively debunked the article, along with at least some of the inaccuracies that propped it up. The article’s lead author, Gilbert Burnham, was censured by the American Association for Public Opinion Research for refusing to explain basic elements of his methodology. He was also sanctioned by Johns Hopkins, which “suspended Dr. Burnham’s privileges to serve as a principal investigator on projects involving human subjects research”. Johns Hopkins also said it would send an erratum to the Lancet to address inaccuracies in the article’s text.

Bump did not mention any of this fallout from the article he cited; presumably he just didn’t come across it in the course of his own research. But maybe he would have been primed to dig deeper if the Lancet had done all it could to label the graph with an appropriate warning. Letters to a journal are better than nothing, but they are not enough to correct a published false claim; it is incumbent on all involved to flag inaccuracies and misrepresentations as conspicuously as they can.

That said, this particular chapter at least has a happy ending. I wrote to Bump and the Washington Post and they fixed the story, in the process demonstrating an admirable respect for evidence and a commitment to the truth. The Lancet would do well to follow their example.



To: sylvester80 who wrote (1191472)
From: longnshort, 1/7/2020 12:11:26 PM
Recommended by: FJB
2) A study released in October 2004 by a British medical journal, The Lancet, claimed that 100,000 civilians had been killed as a result of the US invasion. To be perfectly frank, it’s hard to see how anyone who has even a passing familiarity with statistics could take the Lancet’s numbers seriously. Fred Kaplan from Slate explains:



“The authors of a peer-reviewed study, conducted by a survey team from Johns Hopkins University, claim that about 100,000 Iraqi civilians have died as a result of the war. Yet a close look at the actual study, published online today by the British medical journal the Lancet, reveals that this number is so loose as to be meaningless. The report’s authors derive this figure by estimating how many Iraqis died in a 14-month period before the U.S. invasion, conducting surveys on how many died in a similar period after the invasion began (more on those surveys later), and subtracting the difference. That difference, the number of “extra” deaths in the post-invasion period, signifies the war’s toll. That number is 98,000. But read the passage that cites the calculation more fully:

We estimate there were 98,000 extra deaths (95% CI 8000-194 000) during the post-war period.

Readers who are accustomed to perusing statistical documents know what the set of numbers in the parentheses means. For the other 99.9 percent of you, I’ll spell it out in plain English, which, disturbingly, the study never does. It means that the authors are 95 percent confident that the war-caused deaths totaled some number between 8,000 and 194,000. (The number cited in plain language, 98,000, is roughly at the halfway point in this absurdly vast range.)

This isn’t an estimate. It’s a dart board.

Imagine reading a poll reporting that George W. Bush will win somewhere between 4 percent and 96 percent of the votes in this Tuesday’s election. You would say that this is a useless poll and that something must have gone terribly wrong with the sampling. The same is true of the Lancet article: It’s a useless study; something went terribly wrong with the sampling.”



Bingo! What Lancet was in effect saying was that they believed 98,000 civilians died, but they might have been off by roughly 90,000 people or so in either direction.
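The arithmetic behind that complaint is easy to check using only the figures quoted above:

```python
low, high = 8_000, 194_000   # the quoted 95% confidence interval endpoints
estimate = 98_000            # the headline figure

midpoint = (low + high) / 2      # 101,000 -- close to, but not exactly, 98,000
half_width = (high - low) / 2    # 93,000 either side of the midpoint
print(midpoint, half_width)      # 101000.0 93000.0

# The upper endpoint is more than 24 times the lower one:
print(high / low)                # 24.25
```

An interval whose upper bound is over 24 times its lower bound is the "dart board" Kaplan describes: the point estimate carries almost no precision.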

Moreover, other sources at the time were coming in with numbers that were a tiny fraction of the 98,000 figure that the Lancet settled on. From a New York Times article on the Lancet study:



“The 100,000 estimate immediately came under attack. Foreign Secretary Jack Straw of Britain questioned the methodology of the study and compared it with an Iraq Health Ministry figure that put civilian fatalities at less than 4,000. Other critics referred to the findings of the Iraq Body Count project, which has constructed a database of war-related civilian deaths from verified news media reports or official sources like hospitals and morgues. That database recently placed civilian deaths somewhere between 14,429 and 16,579, the range arising largely from uncertainty about whether some victims were civilians or insurgents. But because of its stringent conditions for including deaths in the database, the project has quite explicitly said, ‘Our own total is certain to be an underestimate.’”

Via GlobalSecurity.org, here’s another Iraqi civilian death estimate:



“On 20 October 2003 the Project on Defense Alternatives estimated that between 10,800 and 15,100 Iraqis were killed in the war. Of these, between 3,200 and 4,300 were noncombatants — that is: civilians who did not take up arms.”

Given all that, how any informed person can buy into Lancet’s numbers is simply beyond me.