The Complete Guide to WebVR and Immersive Experiences

Yuval Keshtcher
12 min read · Aug 25, 2017

I had the honor of conducting an overseas interview about virtual reality, augmented reality, and mixed reality with Austin Knight.

A UX legend currently working at HubSpot, Austin is an international speaker, published author, co-host of the UX and Growth Podcast, and a mentor at Columbia University. Fittingly for someone who talks about the future, he does all of this while working remotely from Brazil.

You should read this interview if you are one of the following:

  • A product manager
  • A developer
  • A designer
  • A writer
  • A psychologist
  • A VR/AR/MR enthusiast who already knows that the future belongs to immersive experiences

When it comes to virtual reality, augmented reality, and mixed reality, Austin has built a hands-on process for researching and creating immersive experiences with emerging technologies, and now he is here to share his personal experience with us.

So… Join us when you’re ready and let’s dive in.

What does the future of UX look like?

I believe that VR/AR/MR are the future of UX (especially MR, but that’s another conversation). Knowing this, I wanted to learn how to design and develop for it.

VR is the most mature right now, but it's still early enough that UX patterns are very much being defined. For example, when I had Casey Yee of Mozilla VR on the UX & Growth Podcast, we discussed what a link could look like in VR.

So not only would I be learning how to design and develop for it, but I would likely be forging the UX patterns myself and perhaps helping to define the medium.

Why should UX designers get into building immersive experiences today?

Because it is an incredible opportunity that takes me back to the early days of the web, when everything was kind of wild and still being defined. With that in mind, I quickly determined that I wanted to experiment with VR, and more specifically WebVR, because it was fully accessible; all users would need is an internet connection and a device. And that device could be ridiculously simple, like a cheap Android phone or a laptop, or ridiculously complex, like an HTC Vive. It would work for everyone.

This also meant that anyone could build for it. I work remotely and travel around the world, and oftentimes that means I don't have access to the latest technology, meetups, etc.

What kind of skills should I pursue to start building immersive experiences in VR/AR/MR?

It's best if you know some HTML, CSS, and especially JavaScript. You'll feel like a wizard once you get the hang of it. Here are links for those tools: A-Frame and MagicaVoxel.

At some point, you're probably going to need some help (just as I did), so I would rely heavily on the A-Frame support docs, the A-Frame Slack, and the WebVR Slack. For inspiration, check out Google's WebVR Experiments. And you can also take a look at the WebVR site that I built.

If you want to learn more about UX & VR there are a ton of resources here.

What tools are WebVR UX designers using these days?

Personally, I would start with A-Frame and MagicaVoxel. Do some basic modeling in MagicaVoxel and import that into A-Frame. Then work on creating animations, interactions, etc. in A-Frame.

There are a lot of WebVR frameworks out there, and they vary drastically in their use cases and complexities.

So while A-Frame and three.js have you working directly in the code with a lot of control over the experience, tools like Vizor, Hologram, or PlayCanvas give you a fully visual editor, but with less control. And then you have other sort-of-hybrid tools like Sketchfab.

With my sights set on WebVR, I began researching potential avenues that I could start with.

I eventually zeroed in on A-Frame for its widespread adoption and documentation, its ease of use, and its connections with Mozilla and the open-source community. The natural first step was to spin up a boilerplate and start messing with it (or in my case, breaking it). This has always been how I've learned; not by watching tutorials or taking classes, but by doing. A-Frame's boilerplate is incredibly simple.

Beyond this, A-Frame has its own unique version of the browser inspector, but for VR, which can be accessed by hitting CTRL + ALT + I on any A-Frame project.

This brings up a sort of visual editor, so I could play around with elements on the boilerplate, as well as with elements on other live A-Frame projects around the web. This helped me to understand A-Frame’s ingenious entity-component structure, which looks and feels a lot like HTML, even though it isn’t.
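To make that entity-component structure concrete, here is a sketch of a minimal A-Frame scene (the shapes, colors, and release version are illustrative, not taken from Austin's project):

```html
<!-- A minimal A-Frame scene: each tag is an entity, each attribute a component -->
<html>
  <head>
    <script src="https://aframe.io/releases/0.6.0/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <a-box position="-1 0.5 -3" rotation="0 45 0" color="#4CC3D9"></a-box>
      <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>
```

Every tag is an entity and every attribute is a component, which is why the markup looks and feels like HTML while actually driving a full 3D scene.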

What did your first WebVR project look like?

Once I was familiar enough with A-Frame and how it worked, I was ready to start building. Honestly, a lot of the "training" that I put myself through was just verifying that things were actually as simple as they seemed, because I couldn't believe it; I couldn't believe that such simple code could create something so incredible.

At that point, I needed to determine what I wanted to build and how I would build it. I do a lot of speeches and host a podcast, and they’re often in different places around the world, so not everybody that follows me can attend. So I landed on creating a place where people could watch my speeches and podcast in VR, as if they were in the room with me. This meant that I would need to create a sphere to attach my 360 videos to, and then place some elements in the sphere for navigation and interactions.

I began with A-Frame's detailed documentation and created a sphere. Then I added a few 3D primitives generated by A-Frame and used MagicaVoxel to create some more complex 3D elements, all of which would live within the sphere and function as the user interface (but in 3D space).
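As a rough sketch of that setup (the element names and file path here are hypothetical), a 360 video sphere with interface elements placed inside it might look like this in A-Frame:

```html
<a-scene>
  <a-assets>
    <!-- Hypothetical 360 video file of a talk -->
    <video id="talk360" src="talk-360.mp4" autoplay loop crossorigin="anonymous"></video>
  </a-assets>

  <!-- The 360 footage is textured onto the inside of a sphere around the viewer -->
  <a-videosphere src="#talk360"></a-videosphere>

  <!-- UI elements live inside the sphere, in 3D space, rather than on a HUD -->
  <a-text value="Next talk" position="0 1.6 -2" align="center"></a-text>
  <a-box class="nav-button" position="0 1 -2"
         width="0.6" height="0.3" depth="0.1" color="#7BC8A4"></a-box>
</a-scene>
```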

What is your current WebVR UX process?

It was when I started to learn JavaScript that things got really great. Throw a few JavaScript events into an A-Frame scene and it fully comes to life. At the end of it all, I launched my VR experience, and thousands of people from all around the globe were using it.
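Since A-Frame entities are ordinary DOM elements, wiring up interactions is plain JavaScript. A minimal sketch (the IDs and colors are illustrative) using a gaze-based cursor that fires standard click events:

```html
<a-scene>
  <a-camera>
    <!-- A gaze-based cursor fires standard 'click' events on the entity it targets -->
    <a-cursor></a-cursor>
  </a-camera>
  <a-box id="nav-button" position="0 1 -2" color="#7BC8A4"></a-box>
</a-scene>

<script>
  // A-Frame entities are DOM elements, so ordinary event listeners work
  var button = document.querySelector('#nav-button');
  button.addEventListener('click', function () {
    // React to the interaction by updating a component
    button.setAttribute('color', '#EF2D5E');
  });
</script>
```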

But at the same time, in user testing and analytics, I saw that users weren’t utilizing the full 360 area of the experience, and this had also been observed in other products. So I developed a tutorial that would introduce my users to WebVR, while simultaneously getting them to complete a 360 revolution.

It would introduce them to the nature of the experience, both through talking to them about it, and through literally walking them around the sphere. That was when users started really taking advantage of the experience that I created, and it was then that I realized I had just applied my entire UX process to a VR project.

Highly satisfying, transformative from a design and growth perspective, and way easier than I expected it to be. Perhaps best of all, I then used this as a proof of concept to launch VR initiatives at HubSpot.

What are your conclusions after doing UX for WebVR?

It taught me a lot about how UX would work in VR.

  • First, there is the realization that we now have a completely new element of design to work with. While designers would classically work with combinations of red, green, and blue, VR introduces a fourth element: depth. And that can drastically change what is possible in a design.
  • Second, you realize that whether the user has 3 Degrees of Freedom (just the ability to look around) or 6 Degrees of Freedom (the ability to look around and walk around) can have a dramatic impact on how you construct the experience (I ended up making one of each).
  • Third, you come to terms with the fact that, despite sound classically being a “no-no” on the web, it is critical to a VR experience.
  • Fourth, you realize that the best interfaces are actually integrated into the environment, rather than attached to the user (like a HUD).

How do you see the future of User experience?

I believe that the future of UX is largely in MR.

It is all part of a natural evolution. We’ve seen design go from a single desktop experience, to multiple desktop resolutions, to multiple devices with multiple resolutions, to devices that may not even have a screen in the traditional sense, and so on.

Through this, we can observe that users aren't actually addicted to their devices; they discard them and move on to new and better ones pretty easily. Rather, they're addicted to the information and experiences that those devices provide. And once that information can be integrated into our real world, the need for computers, smartphones, and 2D interfaces in general will nearly vanish.

What types of professionals will hold these kinds of UX positions?

In the same way that we're seeing a mix of people with backgrounds in design, computer science, psychology, marketing, research, etc. serving as UX designers, we'll see them filling roles in MR. But contrary to the idea being perpetuated about UX designers becoming obsolete, I would suggest that the industry as a whole will simply evolve, just as it always has. What it means to be a "UX Designer" will change; and it will change dramatically with MR.

Individuals with experience in 3D space, sound design, embodied cognition, spatial processing, cognitive load, proprioception, human factors, and ergonomics will all have massive advantages. Hence why I began this project in the first place.

Do UX designers have to worry about the future?

Encouragingly, this project really allowed me to observe that while the technologies and mediums for design will change with MR, the principles will largely remain the same.

If you’re a good UX Designer on the fundamental level, this transition is going to feel pretty natural for you. You just have to be open to it.

Do you see new emerging technologies such as deep learning combining with VR/AR/MR?

Emerging technologies in general, though they may look like toys on their own, become extremely impactful when placed in the context of MR.

Deep learning is a great example of that. In fact, deep learning and quantum computing will likely be the most critical elements for widespread MR growth and adoption.

Being able to scan an environment and know so much about it, and then tailor the experience based off of it — that’s some powerful stuff, and that’s what makes MR feel so real. Imagine looking at your laptop right now and seeing a spider crawl over the top of it. The device is real, but the spider is fake. But you can’t tell the difference.

Your MR device identified everything in your environment and mapped the physical attributes of your laptop and your desk (this already happens with Hololens — it creates virtual meshes of your environment instantaneously). It observed the lighting in your room and cast it on the spider.

Maybe it even noticed that parts of your laptop were hot or had open holes and made the spider avoid them. Nothing unordinary or that would reduce immersion happens, because the MR device is operating in nearly the exact same environment that you are, but its environment is just a highly accurate virtual replication of yours.

What kind of business opportunities would it bring?

The business opportunities are going to be really crazy when you think about it. Oh, your Coke can is empty? Cool, we're going to automatically order some more from Amazon. Oh, the rear-facing camera in your glasses noticed that your pupils tend to dilate when you look at the new Mercedes C Series.

Maybe Mercedes should start sending you ads and discounts, you know, just to push you across that buying line. Wait a second: that girl across the room just fell on the floor and is convulsing. Your device immediately identifies the issue, alerts medics with a real-time report on the situation, and then displays instructions for what you should do to help her until they arrive. Or how about that new ceramic vase you were thinking about getting? You can't tell if it would fit well in your room.

Well, go ahead, take a look. You'll be able to see the vase exactly where you'd place it, to scale. And you can walk around it, move it, anything. Wayfair is already doing this.

And, if you think about it, this raises the question: if we're all wearing these devices, would you even need a real vase at all?

Are there any useful WebVR communities?

My mind was blown at every turn. And whenever I came up against a problem, I would either refer to the A-Frame documentation or Google it and find an answer on GitHub or Stack Overflow. And if I really got into a bind, the good people in the A-Frame Slack and WebVR Slack were there to help.

That’s perhaps the greatest benefit of building in WebVR: the community is extremely passionate and helpful.

What talks do you recommend for the readers to listen to?

I would really recommend giving a listen to the episode of the UX & Growth Podcast where I had Casey Yee of Mozilla VR on. He was one of the co-creators of WebVR and we had a great, in-depth discussion about UX in VR.

I especially enjoyed this talk from Josh Carpenter, which is already a couple years old, but still very relevant today:

The User Defenders podcast talk on the future of design with Austin Knight, part I.

The User Defenders podcast talk on the future of design with Austin Knight, part II.

Changing the future of education with virtual reality.

Changing the future of how we communicate using virtual reality as an empathy machine.

If I am currently looking for a job in UX & VR, where should I search?

If you want to learn about getting a job in VR, I've found this to be one of the best resources.

Who should readers follow if they want to improve their skills in the WebVR medium?

If you want to follow some good people on Twitter, I recommend checking out:
  • @kentbye
  • @jerome_etienne
  • @mano1creative
  • @helensitu
  • @donrmccurdy
  • @scobleizer
  • @whoyee
  • @joshcarpenter
  • @punchesbears
  • @mxweas

You can also check out this list.

Now It’s Your Turn!

The field of UX for VR/AR/MR is still an enigma waiting to be solved. No one can fully imagine how immersive experiences are going to look and feel in their complete form.

Which means that today, as UX people and designers, we have a huge responsibility. The immersive experiences we create today are going to shape the future of the world as we know it tomorrow, for better or worse.

Thanks to Austin Knight, we now have the opportunity to get hands-on and spec and design our own immersive experiences without further excuses.

Give it a go for yourself. Trust me, you are going to enjoy the process.

Don’t forget to share with me your creation process by tweeting at me 🥑
