A paper published in the Lancet, "Mortality after the 2003 invasion of Iraq: a cross-sectional cluster sample survey", has been much in the news lately. Researchers from Johns Hopkins, funded by MIT, conducted a statistical survey of households in Iraq to estimate the number of deaths in the periods immediately preceding and following the 2003 invasion. Their estimate of the total number of excess deaths in the years since the invasion (deaths that would not otherwise have occurred) is more than an order of magnitude higher than any previous estimate: roughly 655,000 (95% confidence interval: 393,000 to 943,000).
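The arithmetic behind a cluster-sample mortality estimate is straightforward to sketch. What follows is a toy illustration, not the study's actual data or method: the cluster counts are invented, chosen only so the rates land near the paper's reported figures (roughly 5.5 deaths per 1,000 person-years before the invasion, 13.3 after), and a simple bootstrap over clusters stands in for the paper's more careful variance estimation.

```python
import random

random.seed(0)

# Hypothetical clusters (NOT the study's data). Each tuple is
# (people surveyed, deaths in pre-invasion period, deaths in post-invasion period).
clusters = [
    (3000, 21, 135), (2800, 18, 120), (3100, 23, 145), (2900, 17, 118),
    (3050, 20, 138), (2950, 22, 128), (3000, 19, 132), (2850, 18, 122),
]

POPULATION = 26_000_000   # rough Iraqi population (assumption)
PRE_YEARS = 1.2           # length of pre-invasion recall period (assumption)
POST_YEARS = 3.3          # length of post-invasion period covered (assumption)

def excess_deaths(sample):
    """Extrapolate excess deaths from a set of clusters."""
    people = sum(c[0] for c in sample)
    pre_rate = sum(c[1] for c in sample) / (people * PRE_YEARS)    # deaths/person-year
    post_rate = sum(c[2] for c in sample) / (people * POST_YEARS)  # deaths/person-year
    return (post_rate - pre_rate) * POPULATION * POST_YEARS

point = excess_deaths(clusters)

# Bootstrap by resampling whole clusters (the cluster, not the individual,
# is the sampling unit) to get an approximate 95% interval.
boots = sorted(
    excess_deaths([random.choice(clusters) for _ in clusters])
    for _ in range(10_000)
)
lo, hi = boots[249], boots[9749]
print(f"excess deaths ~ {point:,.0f} (95% CI {lo:,.0f} to {hi:,.0f})")
```

Note how wide the interval comes out even with well-behaved toy numbers: cluster designs trade precision for the practicality of surveying a dangerous country, which is exactly why the paper's confidence interval spans hundreds of thousands.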
Having read the article, I can't find anything wrong with their methods; indeed, the paper is extremely well-written and carefully considered (as one would expect the editors to enforce on such a controversial topic). Furthermore, every piece of media coverage I've seen that actually quotes an expert in statistics, polling, or epidemiology praises the study as the best-designed and most comprehensive to date on the topic.
Of course, other people take a different view:
PRESIDENT BUSH: "I don't consider it a credible report...the methodology was pretty well discredited."
GEN'L GEORGE CASEY (commander of US ground forces in Iraq): "[the death toll] seems way, way beyond any number that I have seen. I've not seen a number higher than 50,000. And so I don't give that much credibility at all."
ALI AL DABBAGH (Iraqi gov't spokesman): "The report is unbelievable. These numbers are exaggerated."
It's a little worrying that the people in charge treat scientific research as lacking credibility just because it doesn't jibe with what they expected the answer to be. Isn't this the point of science? To ask questions we don't know the answers to and then build upon the knowledge we gain? Shouldn't this report prompt an immediate follow-up to replicate the findings and narrow that confidence interval? Especially aggravating is the President claiming that the methodology has been discredited, which is not only patently false but sounds idiotic coming from a man who is clearly not an expert in statistical survey techniques.
In a broader sense, this is what drives me insane about politics and government: everything is driven by policy and electoral math, rather than by facts. Some researchers go out and do this incredibly dangerous study (it's not often that you read an academic paper with sentences such as "No interviewers died or were injured during the survey." in the Results section) and when they publish their shocking results, they're dismissed out of hand because they're inconvenient. Bah, I'm staying in science.