What's next for twXplorer? Help us decide.

[twXplorer screenshot]

Just over two weeks ago we launched twXplorer, a tool to help people make sense of searches and find interesting conversations on Twitter.

When we launched the tool we didn’t know how it would be received or what use people would find for it. So far, we've been pretty happy to have more than 13,000 people use twXplorer and to get a few kind words from The Atlantic (“control your own little battalion of news-finding bots”), The Buttry Diary ("Twitter search just got waaaay better"), All Twitter, and Spain’s RedAssociales.

Kind words are great, of course, but nothing beats honest feedback. If you've used the tool and have some ideas or special use cases that twXplorer might address, we'd love to hear from you. We're likely to make a few tweaks to twXplorer and your feedback will help us determine which direction to take. Drop us a line at KnightLab@northwestern.edu.

In the meantime, a few of the features requested so far:

  • Search by date or time range. This is a common request, but unfortunately it's not something we can implement, for the simple reason that Twitter's API doesn't support it. A common related question had to do with the number of tweets examined. For now, twXplorer looks at the most recent 500 tweets that contain the term you searched for.
  • Search by location. Currently twXplorer collects tweets that contain the term you searched for without regard for location. We’ve learned from other projects — specifically NeighborhoodBuzz — that relatively few people geotag tweets, which makes a meaningful analysis difficult. On top of that, the search API only supports searching within a radius around a point, which is not how most people think about location searches.
  • An API. There’s some interest in an API, though we don't yet know which aspect of twXplorer is most intriguing to developers. We’re reaching out to folks who use the tool and will learn more as we do.
  • Integration with Storify. Some people want to Storify tweets directly from twXplorer.
  • Sentiment analysis.
  • A better block list. Paul Watson at Storyful looked at the results of a Twitter list analysis and noted that the list of most frequent terms was pretty weak, including words like “says,” “get,” “new,” and “want” among the top five.
  • Analysis over time. In a slight twist on the first bullet in this list, one user wanted to capture the most common hashtags used by her Twitter lists over the course of a week. Using twXplorer as it’s currently configured, she could achieve the same result by taking a snapshot a few times each day, but that’s an awful lot of work. Maybe scheduling searches would be useful?
  • Download search results/analysis as JSON.
  • Visualizations across snapshots and permalinks to snapshots.
  • Better handling of trigrams. Those of us who are new to computational linguistics are getting a light vocabulary lesson. We’ve got unigrams (single words), bigrams (two-word phrases) and trigrams (three-word phrases). Some folks want us to do a better job with these longer phrases.
  • Facebook equivalent?

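To make the block-list idea above concrete: a block list is essentially a set of stopwords that gets subtracted from the word counts before ranking the most frequent terms. Here's a minimal sketch in Python — the word lists and function names are illustrative, not twXplorer's actual code:

```python
from collections import Counter

# Illustrative block list. A production list would be much longer and
# probably language-aware; "says," "get," "new," and "want" are the weak
# terms flagged in the Storyful feedback above.
BLOCK_LIST = {"the", "a", "to", "and", "of", "in", "is",
              "says", "get", "new", "want"}

def top_terms(tweets, k=5):
    """Count words across tweets, skipping anything on the block list."""
    counts = Counter(
        word
        for tweet in tweets
        for word in tweet.lower().split()
        if word not in BLOCK_LIST
    )
    return counts.most_common(k)

tweets = ["New tool says Twitter search is better",
          "Want a better Twitter search tool"]
# "says" and "want" are filtered out; the surviving words tie at 2.
print(top_terms(tweets, 3))
```

A better block list in this scheme is simply a bigger, smarter set — the hard part is deciding which terms are noise for a given audience.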
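And for the trigram bullet, the unigram/bigram/trigram vocabulary can be demonstrated in a few lines. This is a toy illustration of the terminology, not how twXplorer is implemented:

```python
from collections import Counter

def ngrams(tokens, n):
    """Return all n-word phrases (n-grams) in a list of tokens."""
    return [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

tweets = [
    "knight lab launches twxplorer search tool",
    "twxplorer search tool makes twitter search better",
]

counts = Counter()
for tweet in tweets:
    tokens = tweet.split()
    for n in (1, 2, 3):  # unigrams, bigrams, trigrams
        counts.update(ngrams(tokens, n))

# The trigram "twxplorer search tool" appears in both tweets:
print(counts["twxplorer search tool"])  # -> 2
```

Doing "a better job" with trigrams mostly means deciding which of these longer phrases are meaningful enough to surface, rather than just counting them.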
Again, that's a pretty long list and we'll be digging deeper over the next few weeks to learn more about how people actually use the tool.

Remember, if you're a twXplorer user, you can help shape its future. What would make the tool really useful for you? What part of Twitter search is the most painful to deal with, either on Twitter itself or on twXplorer? What do you consistently find yourself wanting when you search?

Have an answer? Drop us a line at knightlab@northwestern.edu.

About the author

Ryan Graff

Communications and Outreach Manager, 2011-2016

Journalism, revenue, whitewater, former carny. Recently loving some quality time @KelloggSchool.
