Bad survey stories are my pet peeve.
First of all, a journalist who copies numbers off a press release and does a quick rewrite without looking up the actual study is not doing her job.
I suspect the reason this happens is that journalists are afraid of math. That part makes me furious. If someone as bad at math as I am can understand statistics (nothing too complex, but I was a psychology student for at least four years before I got into journalism), then ANY journalist should be able to understand percentages and ratios.
So I was thinking about how the way survey stories are done can be challenged.
Someone has already made a start on this: FiveThirtyEight is a blog with the mission of:
Most broadly, to accumulate and analyze polling and political data in a way that is informed, accurate and attractive. Most narrowly, to give you the best possible objective assessment of the likely outcome of upcoming elections.
It is written by Nate Silver, who follows a published methodology in his analysis and does his best to stay unbiased. Every journalist who writes about numbers or politics should be reading this blog.
MediaBugs is a project for correcting errors and problems in media coverage. The site is currently in beta, focusing on the Bay Area. They have a section for reporting “faulty statistics or math” (though nothing has been reported to that section yet).
But I would like a more aggressive approach.
Doing it Right
What about a site where survey stories that had been published were analyzed along with the original data and methodologies from the surveys?
There are dozens of questions you can ask about a survey to gauge how valid it might be. But honestly, I'd settle for describing the results of the survey correctly. And it wouldn't hurt to point out that a sample of 3,000 women who walk into a particular store or buy a particular product is NOT a sample of all the women on Earth. It isn't even a random sample of women who shop at that store or buy that product — it's a self-selected convenience sample, and no amount of sample size fixes that.
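To put a number on the "describing results correctly" part — this is a sketch, and the function name and the 50% example figure are mine, not from any particular survey — here's the standard margin-of-error calculation for a reported proportion. The key caveat is in the comments: the formula is only meaningful for a genuine random sample, which is exactly what a walk-into-the-store sample is not.

```python
import math

def margin_of_error(p, n, z=1.96):
    """Margin of error for a proportion p observed in a simple
    random sample of size n, at ~95% confidence (z = 1.96).
    NOTE: this assumes random sampling. It says nothing about
    the bias in a convenience sample, no matter how large n is."""
    return z * math.sqrt(p * (1 - p) / n)

# If 50% of a true random sample of 3,000 answered "yes":
moe = margin_of_error(0.5, 3000)
print(f"+/- {moe:.1%}")  # roughly +/- 1.8 percentage points
```

So even a properly drawn sample of 3,000 carries an error bar of nearly two percentage points, and a headline that treats a one-point difference from such a survey as news is over-reading the data before we even get to sampling bias.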
It might be easier to clone Mr. Silver and create sites on various subjects, since his site currently focuses on politics. Easier, anyway, than teaching journalists a little statistics, a little scientific method and a little self-respect.
But I think it would be more fun to build a site where crap surveys can be exposed as, well, crap. And good surveys can be lauded. And the results can be reported CORRECTLY. Shoot, I bet we could even get all kinds of college students to help crowd-source something like this.
Worst case scenario, I guess I’d have to do it myself. After I take a refresher course in social sciences.