Megan Taylor

Fixing Survey Stories

Bad survey stories are my pet peeve.

First of all, a journalist who copies numbers off a press release and does a quick rewrite without looking up the actual study is not doing her job.

I suspect the reason this happens is that ‘journalists are afraid of math.’ That excuse makes me furious. If someone as bad at math as I am can understand statistics (nothing too complex, but I was a psychology student for four years before I got into journalism), then ANY journalist should be able to understand percentages and ratios.

So I've been thinking about ways to challenge how survey stories are done.

538

Someone has already made a start on this: FiveThirtyEight is a blog with the mission of:

Most broadly, to accumulate and analyze polling and political data in a way that is informed, accurate and attractive. Most narrowly, to give you the best possible objective assessment of the likely outcome of upcoming elections.

It is written by Nate Silver, who follows a published methodology in his analysis and does his best to stay unbiased. Every journalist who writes about numbers or politics should be reading this blog.

Error Reporting

MediaBugs is a project for correcting errors and problems in media coverage. The site is currently in beta, focusing on the Bay Area. They have a section for reporting “faulty statistics or math” (though nothing has been reported to that section yet).

But I would like a more aggressive approach.

Doing it Right

What about a site where published survey stories were analyzed alongside the original data and methodologies of the surveys themselves?

There are dozens of questions you can ask about a survey to gauge how valid it might be. But honestly, I'd settle for describing the results of the survey correctly. And it wouldn't hurt to point out that a sample of 3,000 women who walk into a particular store or buy a particular product is NOT a sample of all the women on Earth. It's barely an acceptable sample of the women who shop at that store or buy that product!
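To make the point concrete, here's a minimal sketch (my own illustration with made-up numbers, not figures from any actual study) of why a big sample doesn't rescue a biased sampling frame. The standard margin-of-error formula only measures sampling noise within the population you actually drew from:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion p
    estimated from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical survey result: 62% of 3,000 store customers agreed.
p_hat, n = 0.62, 3000
print(f"Margin of error: +/- {margin_of_error(p_hat, n):.1%}")
# -> about +/- 1.7%, which sounds authoritative...

# ...but that figure only describes women who shop at this store.
# A convenience sample of customers says nothing about "all women,"
# and no sample size, however large, corrects for a biased frame.
```

That's the kind of context a good survey story should report: who was actually sampled, and what population the numbers can speak to.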

It might be easier to clone Mr. Silver and create sites on various subjects, since his current site focuses on politics. Easier than teaching journalists a little stats, a little scientific method and a little self-respect.

But I think it would be more fun to build a site where crap surveys can be exposed as, well, crap. And good surveys can be lauded. And the results can be reported CORRECTLY. Shoot, I bet we could even get all kinds of college students to help crowd-source something like this.

Worst case scenario, I guess I’d have to do it myself. After I take a refresher course in social sciences.

• I'll be the first to admit my math skills aren't sterling (sorry, Mark Luckie), but I think this is yet another place where transparency has a huge part to play. I rarely, if ever, see a survey story that includes a “how we arrived at these conclusions” section. Some people will take that as bias; I just take it as a holdover from the print era, when space was a concern and communicating the results was more important than describing the methodology.

    I don't mean to undercut your point; I absolutely think we need to call B.S. more often on crap reporting. But I also think that a cornerstone of good math skills is showing one's work.
