30 tabs deep — How can we build a tool to track our journeys around the Internet?

These days curiosity is likely to lead you on a long trek through the depths of the Internet. You read one article and you stop at a shiny hyperlink that screams, “click me!” Before you know it, you are 30 tabs deep and way off topic. I value these journeys for the unexpected treasures that lie along the way, but sometimes the connection of that treasure to your origin isn’t clear.

Though you have the back button, your browser history, and perhaps multiple tabs to save you, it may still be hard to understand how all of the webpages connect to each other. How do we make them feel more connected? If you could visualize the connections, you could see how information spreads across the web or, in the case of web articles, how the story has grown.

This fall, I set out to make a tool that would track an internet browsing session and then map out the journey a user took in the form of a force-directed graph. The nodes of the graph would represent the links the user visited, and the paths would show the journey to and from those links in relation to the origin of the session. Ideally, the nodes would grow larger based on the number of links connected to them, and the links would show the direction the user took to get there, where they went, and whether or not they returned.

As I delved deeper into building out this idea, I came across more and more questions as to how this would work.

I figured that the user would want their browsing session to be undisturbed by the tool, so the interface would have to be easy to use and fairly discreet. The obvious direction to take was to create a browser extension that would track your session and compile the data. The not-so-obvious part would be finding a place to display the graph of your session. Creating a web app with user profiles would not only allow for displaying the graphs but would also let you save your sessions for later use.
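To make the extension side concrete, here is a minimal sketch of what the tracking could look like, written against the Firefox WebExtensions tabs API. The file name, the visits array, and the permission setup are my own assumptions for illustration, not part of the finished tool:

    // background.js, a rough sketch: record every URL change during the session.
    // Assumes a manifest.json that requests the "tabs" permission.
    const visits = [];

    browser.tabs.onUpdated.addListener((tabId, changeInfo, tab) => {
      // changeInfo.url is only present when the tab actually navigates to a new URL.
      if (changeInfo.url) {
        visits.push({ tabId: tabId, url: changeInfo.url, time: Date.now() });
      }
    });

A flat list of visits like this is only the raw material; turning it into a graph and getting it out of the extension is where the harder questions start.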

When I started, my experience with any of the technologies that could produce this tool was slim to none. I decided to start with a Firefox extension to work on tracking sessions, and I quickly learned about the limitations that browser security protocols place on extensions. I could easily grab the user's current URL, and the new one whenever they moved to another page, but I could not get information from other tabs in the browser. This was a problem because hyperlinks commonly open their targets in a new tab. The webpage in the new tab may have several hyperlinks of its own, and there wasn't a way to relate one link to the others. When trying to visualize the network, this would make it hard to tell the different paths of nodes apart.

There are several libraries for producing force-directed graphs, and I took a stab at D3 because its graphs are rendered from a JSON object. The idea would be to expand that object so that every node carries a URL. If you expand it further to include "to" and "from" URLs, you can start to differentiate the paths between the links.
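As an illustration of that idea, here is one hypothetical shape the JSON object could take, with a URL on every node and "to" and "from" URLs (plus a returned flag) on every link. The field names are my own guesses rather than anything D3 requires:

    // A hypothetical session object: each node carries the URL of a visited page,
    // and each link records a single hop between two of those pages.
    const session = {
      nodes: [
        { url: "https://example.com/origin-article" },
        { url: "https://example.com/shiny-hyperlink" }
      ],
      links: [
        {
          from: "https://example.com/origin-article",
          to: "https://example.com/shiny-hyperlink",
          returned: false   // did the user come back to the "from" page?
        }
      ]
    };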

I am still looking for the right data structure to accurately represent each web page, but the model of "to" and "from" links may be a start. The next step I'll take is to modify how D3 renders the graphs so that it uses these added features of the JSON object. The links between the nodes would then be determined by the "to" and "from" URLs. If everything goes according to plan, I will be able to render a network of webpages as a force-directed graph. But there is still the problem of writing the object and sending it to the web app for rendering.
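Here is a sketch of how that rendering step might look with D3's force layout (v4 or later), assuming the hypothetical session object above. Links are matched to nodes by URL, and each node's radius grows with the number of links touching it:

    // Sketch only: turn the session object into a force-directed graph.
    const nodes = session.nodes;
    const links = session.links.map(l => ({ source: l.from, target: l.to }));

    // Count how many links touch each URL so busier pages can be drawn larger.
    const degree = {};
    links.forEach(l => {
      degree[l.source] = (degree[l.source] || 0) + 1;
      degree[l.target] = (degree[l.target] || 0) + 1;
    });

    const simulation = d3.forceSimulation(nodes)
      .force("link", d3.forceLink(links).id(d => d.url))  // resolve link endpoints by URL
      .force("charge", d3.forceManyBody())
      .force("center", d3.forceCenter(400, 300));

    const circles = d3.select("svg").selectAll("circle")
      .data(nodes)
      .enter().append("circle")
      .attr("r", d => 5 + 2 * (degree[d.url] || 0));

    simulation.on("tick", () => {
      circles.attr("cx", d => d.x).attr("cy", d => d.y);
    });

Drawing lines for the links, and arrowheads for direction, would follow the same enter/append pattern; getting the session object from the extension to the web app, perhaps as an ordinary HTTP request, is the remaining piece.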

I think my next step should be to hit the drawing board again and focus more on the design of the whole tool (which is something I usually pass over too quickly in my rush to start building). Ending up with a partially developed browser extension and some experimental D3 graphs did not point me in a clear direction. I think that sticking with this project will help my development as a programmer, and taking a swing at D3 could prove helpful for future work with data visualization. Hopefully some solid progress will be made in a few months' time, especially with figuring out a good name for the project—that's really what I get excited about!

About the author

Michael Martinez

Student Fellow
