Journalism, computer science students to unveil eight collaborative projects

Next week, journalism and computer science students from Northwestern’s “Collaborative Innovation in Journalism and Technology” class will unveil the prototypes they’ve built over the past 10 weeks. And you’re invited to see what they’ve come up with.

The students have been working since April, when my Knight Lab colleague, Associate Prof. Larry Birnbaum of the computer science department in the McCormick School, and I formed eight interdisciplinary teams from the 27 students enrolled in the class (11 journalism master's students, 13 computer science undergraduates, two engineering master's students and one undergraduate double majoring in journalism and computer science). We gave them a list of broadly defined project ideas, asked for their preferences and tried to assign students to the ideas that interested them most.

The projects they will be presenting are:

  • ChatterTrack: Analyzes and visualizes what followers of a Twitter account are tweeting about.
  • Slimformation: Tracks the kinds of content a user is viewing and provides advice on how to improve his or her "information diet."
  • MusicRx: Modeled after the BookRx project developed at the Knight Lab, recommends music based on the content of a user's tweets.
  • Sensus: Helps journalists find newsworthy data in the U.S. Census.
  • twXplorer: Enables journalists to explore and save tweets about a topic they are interested in.
  • Timeoutline: Lets publishers tie together multiple articles about a topic over time into a timeline-based navigation system.
  • SportsTweet: Visualizes the hot topics being discussed by sports fans on Twitter, designed for a sports-centric broadcast such as ESPN's SportsCenter.
  • onMessage: Helps journalists or people interested in politics track the topics candidates talk about on the campaign trail.


In general, we strive for a mix of projects relevant to journalism and media: tools for journalists, software for publishers, and applications that could be useful or fun for media consumers. We use an agile-development approach: teams are expected to present an updated version of their project each week. Every team now has a functioning prototype, and between now and the final presentation, the students will be improving and extending what they've built so far.

This is the second time Larry and I have taught this course together. It's a lot of fun for us because we enjoy seeing the journalism and computer science students learn to communicate and collaborate. It's also fun because many projects end up evolving in ways the faculty could not have anticipated. Some projects end up being further developed by the professional staff here at the Knight Lab.

Please RSVP for the event on Meetup (6:30 p.m. Chicago time, Wednesday, June 12, at the McCormick Tribune Center Forum). If you can't attend in person, we'll be streaming the presentations at http://bit.ly/Collab-Inno-Spring13.

We hope you'll join us, in person or on the live stream. Want to get a sense of the issues the students have been wrestling with? Check out the class blog, Tech Media Street.

About the author

Rich Gordon

Professor and Director of Digital Innovation

The intersection of journalism and technology has been my passion for 25 years, from data journalism to serving as the Miami Herald's web director to, now, hacker journalism.
