How to create prototypes for AR using Keynote and a smartphone
A step-by-step guide for budding AR designers.

Designing for AR is perhaps one of the most interesting applications of UX. As this incredible technology is put to use in unique applications, UX Designers are tasked with creating user interfaces for augmented experiences, a space with no set guidelines at the moment. This requires a lot of design iterations and prototyping to get to a solution that works. But how can you prototype for AR, which exists in a digital space visible only through the screens of our devices? Solutions like paper prototyping or digital prototypes do not help designers test interactions in the physical context where the application will be used. And physical context is essential to augmented reality.
How can you prototype for AR, which exists in a digital space visible only through the screens of our devices?
My two amazing colleagues (Lidong Liu and Liang Ce) and I recently faced this very problem when we were designing an application where a person could “try out” different hairstyles on themselves using AR. In this article, I don’t want to talk about our findings or the design process (which you can read about here). Rather, let’s focus on how we created prototypes for the project in a way that allowed us to test them. I believe this method will help a lot of AR UX Designers out there.
When the time came to create the user flow and UI for the screens where the user edits their hairstyle, we were stuck. We couldn’t find quick prototyping tools for design concepts that use augmented reality. Low-fidelity wireframes did not provide enough physical context: we couldn’t hold them in hand and actually watch our hair being digitally edited. Tools like ARKit or ARCore would be too time- and resource-intensive for quickly testing basic interactions and concept directions. We soon started using the front-facing camera and imagining interactions on the screen. We liked this idea and really wanted to try these interactions on top of the camera feed. That’s when we thought of an innovative way to create such prototypes.
By simply using our smartphones and Keynote, we created an efficient and tactile prototyping method for AR applications. By recording ourselves through the camera, we can create prototypes that provide a lot of insight. Here’s the four-step process we used to create an AR prototype for our project.
1. Recording videos of the subject
I had long hair at the time and could style it in a lot of ways; you can also use wigs or makeup for this part. We took multiple videos of me, changing my hairstyle for each one.

Note: Make a storyboard or a task list so that you can keep track of the number of videos to create.
2. Exporting UI wireframe and components
After creating wireframes for our hairstyle editing concepts, we exported them as PNG images. It’s important to check which wireframe components are going to sit on top of the camera feed. These components must have transparent areas through which the camera feed will be visible. In our case, we placed a transparent area as shown in the image below.

Note: Components and widgets that move on the wireframe will have to be exported separately.
3. Adding video to the UI
Now that we had the videos and the wireframes, it was time to put them together. But we did not use any of the design tools that would be the obvious choice at this point. Instead, we created the prototype in Keynote. Yes, Keynote, the presentation application on macOS! Create a new presentation and change the slide size to fit your app wireframe: click ‘Document’ in the top right of the screen and change the slide size. Now build the slide deck by adding all the wireframes in the required sequence. This is also where you add the video as a background element and overlay the wireframe on top of it.

Tip: If you have transitions or animations in your wireframes, you will have to recreate them in Keynote. These can be tricky, but in most cases you can get a good prototype with this method even without the animations.
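Under the hood, the overlay step is just alpha compositing: the wireframe PNG sits on top of each video frame, and the video shows through wherever the PNG’s alpha channel is zero (the transparent cutout from step 2). If you ever want to script this instead of using Keynote, here is a minimal sketch with NumPy; it assumes the frame and wireframe have already been loaded as RGB and RGBA arrays (e.g. via Pillow or OpenCV), which is not part of the original method:

```python
import numpy as np

def overlay_wireframe(frame_rgb, wireframe_rgba):
    """Alpha-composite an exported wireframe PNG (RGBA) over a video frame (RGB).

    Wherever the wireframe's alpha channel is 0 (the transparent cutout),
    the camera frame shows through -- the same effect as the Keynote overlay.
    """
    alpha = wireframe_rgba[..., 3:4].astype(np.float32) / 255.0  # (H, W, 1)
    ui = wireframe_rgba[..., :3].astype(np.float32)
    composited = ui * alpha + frame_rgb.astype(np.float32) * (1.0 - alpha)
    return composited.astype(np.uint8)

# Tiny demo: a 4x4 mid-gray "camera frame", overlaid by an opaque white UI
# with a fully transparent 2x2 cutout in the middle.
frame = np.full((4, 4, 3), 128, dtype=np.uint8)
ui = np.full((4, 4, 4), 255, dtype=np.uint8)  # opaque white everywhere
ui[1:3, 1:3, 3] = 0                           # transparent cutout
out = overlay_wireframe(frame, ui)
```

In the demo, the cutout pixels keep the frame’s gray value while the rest of the image is covered by the white UI, mirroring how the camera feed peeks through the wireframe in the slide deck.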
4. Testing on mobile screens
A lot of you might not know this, but Keynote can play a slide deck on an iPhone. If the slide size is set to that of the iPhone, the presentation takes up the whole screen, and tapping the screen advances to the next slide. So, to test our concepts, I acted as if I were interacting with the screen, trying out new hairstyles on myself. This simple prototype allowed us to iterate on different design concepts. If you don’t have macOS or an iPhone, you can use Google Slides to similar effect.
A sample video is shown below. You can also watch our complete prototype video: https://youtu.be/TM59NZ32uzw
Tip: Using methods like bodystorming or role-playing is a good way to utilize the prototypes based on your designs.
The pros
Efficiency
The process is very fast and ideal for prototyping. Every time we wanted to change something in the UI, we could make the changes in Figma, export the wireframes, drop them into the slide deck, and the prototype was ready.
Contextual
Compared to paper or digital prototypes, this method is much closer to the real implementation of an AR application. When you role-play with the video in real scenarios, details about the user interactions become obvious that might otherwise go undiscovered.
The cons
Scalability
Using this method for a single flow is easy, as you only need to record the videos once. But once you start using it for an application with many user flows that require AR prototypes, the list of videos and slide decks to prepare grows very quickly. At that point, you end up spending more time recording and editing video than designing. This is a problem we faced as our project progressed.
User Testing
The application we created could not be user tested, as it wouldn’t make sense for a participant to look at my face while editing their own hairstyle. More generally, for prototypes where the participant’s own face has to appear in the app, it’s difficult to extend this method to people outside the team. Yes, we could record their videos and build the prototype, but that defeats the purpose of a user test. AR applications for objects, rather than people, won’t be limited by this.
So, that’s the technique we developed for prototyping for AR. What do you think? Let me know in the comments if you liked it or if it can be improved in some way.