Introducing JuxtaposeJS, an easy way to compare two frames

We’re pleased to announce JuxtaposeJS, a new Knight Lab tool that helps journalists tell stories by comparing two frames, including photos and GIFs.

JuxtaposeJS is an adaptable storytelling tool and is ideal for highlighting then/now stories that explain slow changes over time (growth of a city skyline, regrowth of a forest, etc.) or before/after stories that show the impact of single dramatic events (natural disasters, protests, wars, etc.).

For example, check out this NASA image of the sun in its normal state and during a solar flare:

[Embedded JuxtaposeJS slider: NASA images of the sun before and during a solar flare]

JuxtaposeJS is free, easy to use, and open source. Almost anyone can use JuxtaposeJS, so long as you’ve got links to two similar pieces of media (hosted on your own server or on Flickr). Once you've got the links, all you need to do is copy and paste the URLs into the appropriate fields at Juxtapose.knightlab.com, modify labels as you see fit, add photo credits, and then copy and paste the resulting embed code into your site.
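
The embed code you paste is just a small block of HTML. For anyone who would rather work with the markup directly, here is a rough sketch of what a self-hosted slider can look like; the CDN paths, image URLs, labels, and credits below are illustrative, and the snippet the authoring tool generates for you may differ:

```html
<!-- Load the JuxtaposeJS stylesheet and script (paths shown are assumptions). -->
<link rel="stylesheet" href="https://cdn.knightlab.com/libs/juxtapose/latest/css/juxtapose.css">
<script src="https://cdn.knightlab.com/libs/juxtapose/latest/js/juxtapose.min.js"></script>

<!-- Two images inside a div with class "juxtapose" become a slider on page load. -->
<div class="juxtapose">
  <img src="https://example.com/sun-normal.jpg" data-label="Quiet sun" data-credit="NASA/SDO" />
  <img src="https://example.com/sun-flare.jpg" data-label="Solar flare" data-credit="NASA/SDO" />
</div>
```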

Before we built JuxtaposeJS, I looked at dozens of before/after sliders and tried to take the best of each. With JuxtaposeJS you can set the handle's starting position to highlight the area of change, and readers can click to move the slider instead of dragging it. It also works on phones and tablets with the touch or swipe of a finger.

You've probably seen similar tools elsewhere. They work well, but we built JuxtaposeJS without relying on jQuery, which makes the tool more lightweight, flexible, and adaptable. It’s also accessible to any newsroom or journalist, regardless of technical skills. Simply fill in the forms at Juxtapose.knightlab.com to build your slider. It's that easy.
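
Because the library has no jQuery dependency, developers can also drop it into a page and build a slider programmatically instead of using the form. Here is a minimal sketch, assuming the `juxtapose.JXSlider` constructor and option names exposed by the open-source library; the element id, image URLs, labels, and option values are placeholders:

```html
<div id="solar-flare-slider"></div>

<script>
  // Assumes juxtapose.js and juxtapose.css are already loaded on the page.
  // Each entry describes one frame of the comparison.
  var slider = new juxtapose.JXSlider(
    '#solar-flare-slider',
    [
      { src: 'sun-normal.jpg', label: 'Quiet sun',   credit: 'NASA/SDO' },
      { src: 'sun-flare.jpg',  label: 'Solar flare', credit: 'NASA/SDO' }
    ],
    {
      startingPosition: '35%', // start the handle over the area of change
      showLabels: true,
      showCredits: true,
      animate: true,           // clicking animates the handle instead of requiring a drag
      mode: 'horizontal'
    }
  );
</script>
```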

The JuxtaposeJS interface is simple and allows you to immediately see the impact of your edits in the preview window.

JuxtaposeJS joins four other tools in Publishers' Toolbox, and we hope it makes the suite even more useful for journalists looking for quick-to-deploy, easy-to-use storytelling tools.

We hope you’ll find JuxtaposeJS useful and use it to make your stories more interesting and informative. If you use it, share your work with us at knightlab@northwestern.edu or @KnightLab.

About the author

Alex Duner

Student Fellow
