I’ll start by saying that the Eurozone Meltdown was my favorite of the data journalism case studies we read. I love the human element, and how the project used data not as the story itself but to support and deepen the one-on-one human story.
The story helped me realize that data visualization doesn’t have to be centered on one big interactive JavaScript project; it can also provide an understanding of what is happening behind the story.
Sometimes data and journalism are very disconnected, and it goes both ways. Often, many newsrooms don’t have the time or money to invest in data-driven journalism. Something that was mentioned in the Open Knowledge Foundation case reminded me of this: “Advocacy groups, scholars and researchers often have more time and resources to conduct more extensive data-driven research than journalists.”
OK, it didn’t so much remind me of it as put words to something I felt as a small-town journalist. Most media outlets just don’t have the resources to go beyond their daily deadline and create something bigger. No one is paying them extra (certainly not a small-town paper), and they’re already working longer hours than a typical workweek. It made sense to me that groups with a special interest in a topic would be more willing to invest their time in a project about it.
Another quote that stood out to me, from the investigation of EU funds: “Before you tackle a project of this level of effort, you have to be certain that the findings are original, and that you will end up with good stories nobody else has.”
It’s common sense, yet something that can easily be forgotten. Nine months of data journalism work isn’t worth anything if you’re not producing anything new. I tell my students something similar all the time about their writing: their job is to inform, so they need to provide new information for their readers; otherwise they’re wasting both their own time and the reader’s time.
Though I sort of knew how much work goes into some of these big projects, it was interesting to see just how much collaboration happens among coders, journalists, videographers, data sources and others.
One more thing I found interesting, from the Open Knowledge Foundation: “Data released by governments is often incomplete or outdated.” Not surprising, exactly, but still discouraging to hear. Like, why do governments waste their time on half-assed data? Why? What’s the point? Oh right: because government.