Data reporting and visualization
Gilad Lotan from startup incubator Betaworks and Kelly McBride from The Poynter Institute look at the role algorithms play in our lives. They can help us understand the world better or distort our perceptions of reality. “These algorithms’ embedded power is that they can draw attention, and they can attain attention, and we don’t exactly […]
This 14-slide presentation provides some quick tips on using numbers and math in your reporting. For example, memorize common numbers on your beat, learn to convert to simple ratios (1 out of 10) to keep your numbers small, and use devices from everyday life to make comparisons.
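One of those tips, converting a raw statistic into a small "1 in N" ratio, is simple to do in code. Below is a minimal sketch; the function name `simple_ratio` and the example figures are made up for illustration:

```python
def simple_ratio(part, whole):
    """Convert a statistic like 13,400 of 132,000 into a small '1 in N' ratio."""
    if part <= 0 or whole <= 0:
        raise ValueError("counts must be positive")
    n = round(whole / part)  # how many of the whole per one of the part
    return f"1 in {n}"

# e.g. 13,400 affected residents out of a population of 132,000
print(simple_ratio(13400, 132000))  # -> "1 in 10"
```

Rounding to the nearest whole number keeps the ratio memorable for readers, at the cost of some precision, which is exactly the trade-off the presentation recommends.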
Maite Fernandez shares a few ideas from the NICAR conference on how to find the data you need to report a story when that data isn't readily available. Some are low-tech tactics, such as collecting screenshots and taking photos, as well as building your own database from scratch.
Chrys Wu has compiled a cheat sheet with lists of tutorials, tools, and samples of work from the National Institute for Computer-Assisted Reporting (NICAR) conference taking place in Baltimore. So, even if you’re not there, you can get started with Excel, learn to build maps with Leaflet and mapbox.js, or even take a Python mini-bootcamp.
Damian Radcliffe shares five ways hyperlocals can do data journalism: creating niche blogs or stories using data; using data to illustrate key points; unearthing data through information requests; using networked journalism to fill in gaps; and using data as a tool for partnerships, with hyperlocals identifying stories at a grassroots level which then get escalated […]
The Guardian’s parent group is investing in a local digital training company that teaches people to “Code in a Day” or “Data in a Day,” services that also benefit its newsroom. As part of the deal, it’s also likely the Guardian will leverage its resources and knowledge in other ways. The firm may be involved […]
Understanding the audience via data is prominent in future-of-journalism talk. The Times is looking for help from a somewhat out-of-the-box source: a data scientist interested in the natural sciences and applying similar concepts to the real world. Columbia University associate professor Chris Wiggins will work for the Times on machine learning and predictive modeling. One […]
Adam Marcus compiled a list of some available tools for “scraping” data from text on web pages. He also explains the steps in web scraping, what it is, and where each of the tools (mostly free) fits in the process. While none of them is perfect, Marcus writes, “Where pure automation fails, a human can […]
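The core steps Marcus describes, fetching a page, parsing its markup, and extracting the fields you care about, can be sketched with Python's standard library alone. This is a minimal illustration, not one of the tools on his list; the class name, the sample HTML, and the table layout are all made up, and a real scraper would first fetch the page (e.g. with `urllib.request`) rather than use a hard-coded string:

```python
from html.parser import HTMLParser

# A static snippet stands in for a fetched page to keep this self-contained.
PAGE = """
<table>
  <tr><td>Alice</td><td>42</td></tr>
  <tr><td>Bob</td><td>17</td></tr>
</table>
"""

class TableScraper(HTMLParser):
    """Collects the text of every <td> cell, grouped by table row."""
    def __init__(self):
        super().__init__()
        self.rows = []        # finished rows
        self._row = []        # cells of the row being parsed
        self._in_td = False   # are we inside a <td>?

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_td = True

    def handle_endtag(self, tag):
        if tag == "td":
            self._in_td = False
        elif tag == "tr":
            self.rows.append(self._row)

    def handle_data(self, data):
        if self._in_td:
            self._row.append(data.strip())

scraper = TableScraper()
scraper.feed(PAGE)
print(scraper.rows)  # -> [['Alice', '42'], ['Bob', '17']]
```

In practice this hand-rolled parsing is exactly the tedium the tools on Marcus's list automate, and, as his quote suggests, the human steps in where the automation breaks down on messy markup.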
“There are some amazing algorithms coming out of the computer science community which promise to revolutionize how journalists deal with large quantities of information,” writes Jonathan Stray. “But building a tool that journalists can use to get stories done takes a lot more than algorithms.”
Web scraping is a process that extracts data from a web page, and the folks at Kimono Labs have built a tool that anyone can use. The tool is accessible from the browser, recognizes associations and relations among the data, and can be run and tested in the cloud.