The bogus tweets and photos that marred social media coverage of events like the Arab uprisings and Hurricane Sandy have spurred the development of a number of fact-checking tools. Recently, one of them caught the attention of the Knight Foundation, which awarded Checkdesk a Knight News Challenge grant.

Checkdesk, which provides verification tools for photos, is already in use but developers are interested in expanding and improving it. Tom Trewinnard, a manager at Meedan, a nonprofit working on the Checkdesk project, recently answered some questions about the plans.

Congratulations on your grant from the Knight Foundation. Clearly, building an application like this is expensive. What are the primary costs associated with creating and launching such a project?

TOM TREWINNARD: Thanks – we’re very excited! The grant from Knight will allow us to test some of the ideas we pitched as part of the News Challenge on a small scale. To that end the major costs associated with this stage of our work are going to be research, several rounds of design work, and testing our designs and any prototypes we make. This Knight support will help us find and fix problems earlier in our development process, saving money and energy further down the line as we build diverse features within our core Checkdesk framework.

Asking this for interested newsrooms everywhere: Is there a fee for news organizations that want to implement Checkdesk in their newsrooms? Who does the training and implementation, and how long does that take?

TREWINNARD: Great question – Checkdesk is an open source project and the code is all on GitHub (https://github.com/meedan/meedan-checkdesk) so newsrooms are welcome and encouraged to play around with it. If you want our help implementing Checkdesk then please get in touch!

The question of training touches on two issues: Firstly, training on Checkdesk itself, which shouldn’t take long at all; and secondly, training in verification techniques for User Generated Content. The second question is one of the things we’re looking to tackle with our Knight prototype – we want to make a checklist of verification best practices for a range of different media, in order to help guide journalists through the verification process. The tools should facilitate the training.

The first kind of training can be done by the Meedan team; the second can be done by a range of people, from Meedan to the European Journalism Centre (publishers of the Verification Handbook) and other groups working in verification. We have some open-licensed, Arabic-language resources relevant to this here, courtesy of our fantastic research and training partners Birmingham City University (UK).

Checkdesk incorporates tools such as Tineye, EXIF data, FotoForensics, satellite imagery, and Wolfram Alpha. This sounds intimidating! Can people without advanced technical skills effectively use Checkdesk?

TREWINNARD: We’re building up to integrate these in a way that is “magical” to the end user. To simplify the concept we are focused on developing checklists: you see a photo that you’re not sure about, you drop the link into Checkdesk, and you’re given the image embedded on your own system along with a checklist of steps that you can take to verify (or better assess) that image. Over time we will incorporate more sophisticated features within this same interaction pattern.
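The interaction Trewinnard describes — drop in a link, get back a checklist of verification steps to work through — can be sketched in a few lines of Python. This is a hypothetical illustration of the workflow, not Checkdesk's actual code; the step wordings and function names are invented for the example.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the "link in, checklist out" interaction
# described in the interview -- not Checkdesk's real implementation.

CHECKLISTS = {
    "photo": [
        "Run a reverse image search (e.g. Tineye) for earlier copies",
        "Inspect EXIF metadata for capture date, device and location",
        "Check an image-forensics service for signs of manipulation",
        "Compare weather and landmarks against the claimed time and place",
    ],
}

@dataclass
class VerificationReport:
    url: str
    media_type: str
    steps: dict = field(default_factory=dict)  # step text -> done?

    def complete(self, step: str) -> None:
        """Mark one checklist step as done."""
        self.steps[step] = True

    @property
    def progress(self) -> str:
        done = sum(self.steps.values())
        return f"{done}/{len(self.steps)} steps complete"

def open_report(url: str, media_type: str = "photo") -> VerificationReport:
    """Drop a link in; get back a checklist tailored to the media type."""
    steps = {s: False for s in CHECKLISTS[media_type]}
    return VerificationReport(url, media_type, steps)

report = open_report("https://example.com/suspect-photo.jpg")
report.complete("Inspect EXIF metadata for capture date, device and location")
print(report.progress)  # -> 1/4 steps complete
```

The point of the design is that the journalist never has to remember the steps: the checklist itself carries the best practice, and "more sophisticated features" can later automate individual steps without changing the interaction.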

At the moment, if you want to verify online content, you have to use Tineye (or Google’s reverse image search) and inspect the EXIF data yourself. A multi-step process like that can be arcane to learn and easy to skip while a story is breaking. By incorporating third-party services and best-practice checklists into Checkdesk, we’re aiming to reduce the friction for journalists between having a photo that might be useful for a story and going through the steps to make sure you can use that picture safely and responsibly. Checkdesk is designed for skilled forensic work, but with a push-button-simple interface that is useful even to newcomers.

In general, though, we think that the verification techniques that are commonly used aren’t highly technical: Wolfram Alpha is almost as easy to use as Google Search and could provide you with historical weather data that could debunk a video; there are image forensic tools that can very quickly give you a clear indication of whether a photo has been altered; Tineye can show you if the image you’re using has appeared anywhere else on the internet before.

These are all simple, low-tech steps that all journalists should be using if they are working with photos. Checkdesk verification checklists are intended to help distill these processes and skills together into a more sensible workflow.

Was there a particular event or journalistic mishap that initially sparked the idea for Checkdesk? And what mistakes do you think Checkdesk could have prevented?

TREWINNARD: At Meedan we started working on these ideas during the early stages of the Arab uprisings. At the time we were translating a lot of social media coming out of Egypt, Syria, Libya and Tunisia, and we became very conscious that we didn’t want to be propagating dangerous rumors and misinformation. We realized that some of the content emerging probably couldn’t be verified in a timely manner, but we wanted to do something more helpful and transparent than the inadequate and overused caveat: “This media couldn’t be independently verified.”

We believe Checkdesk can help avoid two kinds of mistakes:

(1) Newsrooms using unverified UGC that later turns out to be false or inaccurate. We’ve seen mistakes from publishers across the board, including MailOnline (the world’s most visited news site) and the BBC. The errors are devastating to public trust but can be avoided in many cases by performing the right series of automated checks. Red flags about a media item’s authenticity can be raised much earlier in the publication cycle.

(2) Users of social networks (including journalists and newsrooms) retweeting, sharing and propagating fake content during breaking news events. After events like Hurricane Sandy and the Boston Marathon bombing, misinformation and rumor appeared and spread rapidly online, especially via Twitter and Facebook. Research on the aftermath of the Boston attacks suggests that 29% of viral content on Twitter was rumor or fake content. The severity of the consequences can be seen in the impact of a tweet sent from a hacked AP account: roughly $130 billion in stock value briefly wiped out in a matter of seconds.

Obviously no tool in isolation is going to fix the issues that lie at the heart of these mistakes, but we believe that if we can work with journalists and editors to build great tools then we can move in the right direction.

Are any U.S. news organizations currently using Checkdesk?

TREWINNARD: Thus far we’ve been working with some great newsrooms, media collectives and J-schools in the Middle East and North Africa, but we’re always looking for collaborators and testers. We are expanding to include US partners in 2014. So if you want to talk about Checkdesk, verification and UGC, contact telrumi@meedan.net — we’d love to hear from anyone interested, whether from large-scale media operations, local news, independent journalists, journalism schools, citizen journalists or public interest groups.
