Designing tools for investigation at the Hacks/Hackers Buenos Aires Media Party

Above, Bike Storming's Mati Kalwill and the Lab's Joe Germuska exchange ideas and show each other project demos at the #hhba #mediaparty media fair.

As I recently wrote, last week Joe and I had the privilege of participating in the Hacks/Hackers Buenos Aires Media Party. We prepared a couple of talks and spoke to the group: mine was about the current state of Knight Lab, and Joe's was about the future of journalism. We also prepared and facilitated a workshop on designing tools for investigation.

Designing software tools for journalistic investigation

We really like this workshop. Since the beginning of the year we have run various versions of it, both for our internal team at the Lab — when we need to think more critically about an idea or research area — and externally as a way to work with the journalism community. We like it because it generates big ideas while also teaching sound thinking about technology projects.

To describe the process simply: we begin the workshop by defining the theme of the discussion. At this event in Buenos Aires, we chose to focus on generating ideas around journalistic investigation, information gathering and research. We explain the workshop as a step-by-step series of brainstorms that become more and more granular as we progress through each step in the process.

We try to focus the design process on solving specific problems for a specific kind of person. We call this abstract person our persona, and we begin by collectively generating the defining characteristics of that persona, writing them in a place that the entire group can see (like a wall, white board, chalkboard or projected on to a screen).

We break up into smaller groups and instruct each group to do some very high-level brainstorming for tool ideas and articulate some specific use-cases or situational scenarios.

After five or ten minutes, we regroup and someone from each team reports back the ideas that were generated. We write them out for everyone to see, and consolidate related ideas, resulting in a handful of possible solutions that could be designed further.

For the rest of the exercise, it is best to divide into groups of about five people. At this event, we had about 15 people who agreed to stick around for the full duration of the workshop — we insist on this level of commitment — so we chose the three best ideas and asked for three volunteers to serve as captains, around whom other interested attendees could gather.

At this point we ask the groups to start generating all of the features for their tool. We ask them to think in terms of very basic tasks: a user would need to be able to input data manually, batch upload data, edit data, view data, export data, etc. Each feature is written on its own index card or post-it note. The group is instructed to think of every single feature that comes to mind and write them all down, without focusing on practicality or methods for execution.

Next, we ask the teams to group the cards into four areas: must-have features, should-have features, nice-to-haves, and "meh." (Sometimes for instructional workshops we leave out the "should-have" priority.) This part of the process helps establish the most fundamental features: the ones the tool absolutely must have in order to solve a given challenge, scenario or use case. It forces the team to think pragmatically and helps the individual team members clarify their intentions and expectations.
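As a loose illustration of this sorting step (the feature cards and their priorities below are invented examples, not cards from the workshop), the grouping amounts to mapping each card into one of the four buckets:

```python
from collections import defaultdict

# Hypothetical feature cards, each tagged with a priority during sorting:
# "must", "should", "nice", or "meh".
cards = [
    ("input data manually", "must"),
    ("batch upload data", "must"),
    ("export data", "should"),
    ("progress tracking", "nice"),
    ("dark mode", "meh"),
]

def group_by_priority(cards):
    """Group feature cards into priority buckets, preserving card order."""
    buckets = defaultdict(list)
    for feature, priority in cards:
        buckets[priority].append(feature)
    return dict(buckets)

groups = group_by_priority(cards)
# The "must" bucket is the minimum feature set the tool cannot ship without.
print(groups["must"])
```

The "must" bucket is what the team defends in the report-back; everything else is negotiable.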

When we apply this process to projects at the Lab, we also estimate the effort each feature will require and the degree of risk or uncertainty involved in implementing it. In workshops there is rarely enough time for this step, and participants don't always have the practical experience to do it effectively. Instead, we usually conclude by asking each group to report back with a three- to five-sentence summary of the idea and how it solves a specific use case for a defined user, plus a quick list of the features the tool must include to be viable.

Our Hacks/Hackers Buenos Aires Media Party workshop was a lot of fun and we were excited to see our friend Nuno Vargas running a design-thinking workshop as well. We had a great time and felt grateful to spend time with great journalists. The following are summaries from each of the three ideas:

A tool for sourcing information in places with limited access to technology

Team: Jaime González A., Mariana Santos, Matias Kalwill, and Mariana Mas

The group began designing a system that would help gather crowd-sourced information in places and communities with limited or low access to technology. They felt it had to have the ability to work offline, batch upload and download data, identify information providers (both the producer and the interviewee), capture audio/photos/video, capture timestamps and geolocation data, and easily create new forms. They felt it would be nice to have the ability to track progress, provide translation, retrieve data from databases, group/tag/score entries, and provide basic data visualization and a preview mode.
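One way to picture the group's must-have requirements (offline capture, provider identification, timestamps, geolocation, batch upload) is as the shape of a single report record queued locally until connectivity returns. This is a minimal sketch; the field names and values are invented for illustration:

```python
import json
import time
from dataclasses import asdict, dataclass, field

@dataclass
class Submission:
    """One crowd-sourced report, captured offline and synced later."""
    provider_id: str                                  # who recorded it
    interviewee: str                                  # who the information came from
    media_paths: list = field(default_factory=list)   # audio/photo/video files
    lat: float = 0.0                                  # geolocation
    lon: float = 0.0
    captured_at: float = field(default_factory=time.time)  # timestamp
    synced: bool = False                              # set after batch upload

# Records accumulate on the device while offline...
queue = [Submission("worker-7", "local resident", ["interview.wav"], -34.6, -58.4)]

# ...and are serialized as one batch when a connection is available.
batch = json.dumps([asdict(s) for s in queue])
```

A real tool would add retry logic and the nice-to-haves (tagging, translation, visualization) on top of this core record.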

Public reputation market

Team: Friedrich Lindenburg, Douglas Arellanes, Annabel Church, and Daniel Carranza

This group proposed a public reputation market for journalists and sources, which would allow a source to give a journalist feedback through evaluation. The idea was a CRM-style mechanism to centralize, rate, and track the sources available to a newsroom. The group liked the idea of an Airbnb-like "rate my reporter" feature that would allow reporters' sources to rate or grade their interview experience. The system would notify the news organization's ombudsman if a journalist received a negative evaluation. The content within the system would include affiliations, subjects/topics, links and an ability to indicate whether the source was on- or off-the-record. A fully featured system could include aggregate rating averages for a given news organization, giving readers and sources a sense of its trustworthiness. It would have a source dashboard and contact tracker, and it would correlate sources to any potential corrections from the news organization.
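The aggregation and ombudsman-notification mechanics could work something like the sketch below. This is only an illustration of the idea as reported; the data, names, and the notification threshold are all invented:

```python
# Hypothetical evaluations: (journalist, news organization, rating 1-5).
evaluations = [
    ("a. reporter", "daily times", 5),
    ("a. reporter", "daily times", 4),
    ("b. writer", "daily times", 1),
]

OMBUDSMAN_THRESHOLD = 2  # ratings at or below this trigger a notification

def org_average(evaluations, org):
    """Aggregate rating average across an organization's journalists."""
    ratings = [r for _, o, r in evaluations if o == org]
    return sum(ratings) / len(ratings)

def flagged_for_ombudsman(evaluations):
    """Journalists with at least one rating at or below the threshold."""
    return sorted({j for j, _, r in evaluations if r <= OMBUDSMAN_THRESHOLD})

print(org_average(evaluations, "daily times"))
print(flagged_for_ombudsman(evaluations))
```

The organization-level average is what would surface publicly, while the flagged list stays an internal signal for the ombudsman.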

A system that attempts to optimize the investigation and research process

Team: Juan Pablo Dalmasso, Aigul Safiullina, Ana Arriagada, Fernando Correa, Lucas Palero and Guillermo Villamayor

This ambitious team wanted a system that would save time for the researcher by doing such things as helping to find new angles, surfacing relevant information and highlighting sources. They were calling it "Investigation Optimizer" and seemed to have defined the system around search. Its necessary features included a workflow that defines the kinds of data of interest, orders and prioritizes data, and manages sources and quotes; an index of public data and social media; and a web interface. They reported that it would be nice if the system also allowed for search filtering (by relevance, chronology, known sources, etc.), topic grouping and structuring of results, the ability to save previous search queries, and a highlighting feature for popular quotes and data sources.
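The filtering and saved-query features the team described could be pictured roughly as below. The result data, field layout, and query names are invented for illustration:

```python
from datetime import date

# Hypothetical indexed results: (title, relevance score, publication date).
results = [
    ("budget leak", 0.9, date(2013, 8, 1)),
    ("old audit", 0.4, date(2011, 3, 5)),
    ("new filing", 0.7, date(2013, 9, 2)),
]

def search(results, min_relevance=0.0, order="relevance"):
    """Filter by a relevance floor, then sort by relevance or chronology."""
    hits = [r for r in results if r[1] >= min_relevance]
    if order == "relevance":
        return sorted(hits, key=lambda r: -r[1])
    return sorted(hits, key=lambda r: r[2], reverse=True)  # newest first

# A saved query is just a stored set of filter parameters to re-run later.
saved_queries = {"recent leads": dict(min_relevance=0.5, order="chronology")}
hits = search(results, **saved_queries["recent leads"])
```

Topic grouping and quote highlighting would then layer on top of whatever this search core returns.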

Special thanks to Joe Germuska for his contribution to this writeup.

About the author

Miranda Mulligan

Executive Director, 2012-2014
