Prototyping Augmented Reality

Something that really frustrates me is that, while I’m excited about the potential AR has for storytelling, I don’t feel like I have really great AR experiences that I can point people to.

We know that AR is great for taking a selfie with a Pikachu and it’s pretty good at measuring spaces (as long as your room is really well-lit and your phone is fully charged), but beyond that, we’re really still figuring it out. Because we’re in such an early stage of defining the paradigms and patterns that make compelling AR content, it’s important to have tools and techniques that allow you to quickly evaluate your ideas and iterate on them. At Knight Lab, we use a combination of 2D and 3D tools to test our AR design hypotheses. This gives us the opportunity to test concepts and refine them before we’ve invested significant resources.

A paper prototype is the easiest and most accessible way to begin bringing your ideas into the world. Paper is cheap and convenient. Collaborators don’t need to learn how to use a new tool or technology before contributing to a paper prototype. However, paper can only get you so far when you’re working in AR because it lacks the ability to convey information about depth and three-dimensionality. Working on paper also encourages you to think in frames. You might think this is okay for mobile AR, since the AR content you’ll be looking at will be framed by your phone’s screen. But the frame in mobile AR is kind of a lie. In mobile AR, the user decides where to put the camera. If you spend significant time prototyping with a static, framed tool like a piece of paper, it’s natural to fall into the trap of designing beautifully composed vignettes that might never be viewed from your intended angle. To encourage your team to think about an AR experience as something that can be approached from infinite perspectives, it’s sometimes helpful to make cutouts in the dimensions of common phones and look through them while walking around the space where you intend to generate your AR experience.
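If you want concrete numbers for those cutouts, a little arithmetic gets you there: a screen’s diagonal and aspect ratio determine its width and height. Here is a minimal sketch of that math in Python; the diagonal sizes and aspect ratios below are illustrative assumptions, not an official device list, so swap in whatever phones your audience actually carries.

```python
import math

# Illustrative screen specs: diagonal in inches, portrait aspect ratio (height:width).
# These are assumed, rough values -- replace them with the phones you care about.
PHONES = {
    "small phone": (4.7, (16, 9)),
    "typical phone": (6.1, (19.5, 9)),
    "large phone": (6.7, (19.5, 9)),
}


def cutout_mm(diagonal_in, aspect):
    """Turn a screen diagonal + aspect ratio into a width x height window in millimeters."""
    h_ratio, w_ratio = aspect
    hypotenuse = math.hypot(h_ratio, w_ratio)
    height_mm = diagonal_in * 25.4 * h_ratio / hypotenuse
    width_mm = diagonal_in * 25.4 * w_ratio / hypotenuse
    return width_mm, height_mm


for name, (diagonal, aspect) in PHONES.items():
    w, h = cutout_mm(diagonal, aspect)
    # Note this is the screen (the part you look "through"), not the phone body.
    print(f"{name}: cut a window about {w:.0f} mm x {h:.0f} mm")
```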

VR tools can also be useful for prototyping AR experiences, especially if you’re going to be modeling your own 3D content. Apps like Google Blocks and Tilt Brush allow you to quickly create 3D content and walk around it with six degrees of freedom. You can even take a 360° photo of the place where your content will be triggered and set it as your background in Tilt Brush. However, a VR prototype isn’t very easy to share, and walking around with a heavy headset strapped to your face is a fundamentally different experience than being outside with your phone. VR is a great tool for experimenting with 3D assets, but it isn’t ideally suited for prototyping the overall vibe of a large-scale AR project.

[Image: a screenshot of the Vuforia app triggered by a VuMark marker inside Knight Lab. This 3D model might augment the succulent garden on the roof of the Ellis Arts and Recreation Center in Bronzeville.]

While working on a recent project, we discovered that AR markers are a useful prototyping tool because they allow you to quickly add, delete and reorder scenes. Instead of using GPS, marker-based AR relies on your phone scanning an image in order to generate an AR experience. While a GPS-based experience might feel more “magic” to the user, it’s also a lot less reliable, especially indoors.
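For our prototypes we used Vuforia and its VuMark markers. If you just want to feel out the core mechanic (the phone recognizes a printed image, and that recognition triggers content), here is a rough stand-in sketch using OpenCV’s ArUco markers and a laptop webcam. This is not our production setup, just an assumed desk-scale approximation; it expects OpenCV 4.7 or newer installed via opencv-contrib-python.

```python
import cv2

# Rough marker-detection loop with OpenCV's ArUco module (assumes OpenCV >= 4.7,
# installed via opencv-contrib-python). The webcam stands in for a phone camera.
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    corners, ids, _rejected = detector.detectMarkers(frame)
    if ids is not None:
        # Outline each recognized marker; in a real prototype this is where
        # you would anchor a 3D model or kick off a scene.
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
        print("Visible markers:", ids.flatten().tolist())
    cv2.imshow("marker preview", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```

Print a handful of markers from the same ArUco dictionary and you can walk a pretend “tour” around the office in an afternoon.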

This past spring, we collaborated with community members from Chicago’s Bronzeville and Pilsen neighborhoods to prototype a mobile AR app to help users discover the history, culture and environment of those neighborhoods. This prototype needed to fulfill many different functions for many different people. We needed to demonstrate the value of investing in augmented reality to potential funders with little experience in immersive media. We also needed to represent the multilayered story our community partners wanted to tell about their neighborhoods. Additionally, the project spanned a large geographic area, one that was about 18 miles from our lab in Evanston. Using markers allowed us to build an experience that was portable and gave a clear representation of both the possibilities of the technology and the content. Since you can recreate a marker-based experience anywhere you have a set of markers, we could iterate on it from campus, and it could also be recreated in a potential funder’s office. The lightweight nature of markers is useful for user studies, too. If a user’s attention is lagging, you can just pull markers out of their path. Think a certain AR event might be better in a different location? Just swap the marker; there’s no need to open up Unity and build a new version of your app.
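The reason swapping is so cheap is that the binding between markers and content can live in data rather than in the app build. Here is a hypothetical sketch of that idea (the marker IDs and scene names are made up for illustration, and our actual app is not described here): edit the config, reshuffle the printed markers, and the “tour” changes without a rebuild.

```python
import json

# Hypothetical marker config. In practice this could be a JSON file or a
# spreadsheet; editing it adds, deletes or reorders scenes without touching the app.
MARKERS_JSON = """
[
  {"marker_id": "vumark-01", "scene": "Rooftop succulent garden model", "order": 1},
  {"marker_id": "vumark-02", "scene": "Bronzeville history audio stop", "order": 2},
  {"marker_id": "vumark-03", "scene": "Pilsen mural overlay", "order": 3}
]
"""


def load_tour(raw):
    """Parse the marker config and return the scenes in walking order."""
    return sorted(json.loads(raw), key=lambda stop: stop["order"])


def scene_for_marker(tour, marker_id):
    """Figure out which scene a freshly scanned marker should trigger."""
    for stop in tour:
        if stop["marker_id"] == marker_id:
            return stop["scene"]
    return "no scene assigned to this marker"


tour = load_tour(MARKERS_JSON)
print(scene_for_marker(tour, "vumark-02"))  # -> Bronzeville history audio stop
```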

These are a few techniques that have worked for us, but we’d love to know how you’re working in AR. Do you have a workflow that really works for you? Have you checked out any of the brand-new prototyping tools? Send us an email at knightlab@northwestern.edu.

About the author

Rebecca Poulson

Engineer

Rebecca leads AR/VR projects at Knight Lab. In addition to being a software developer and playwright, Rebecca is an Oculus Launchpad Fellow and Mozilla Tech Speaker. She has taught WebVR workshops on three continents.
