Extended Reality (XR) explained through the 5 + 1 senses

Extended Reality (XR) is the umbrella term for environments that combine the real and virtual. This includes Virtual Reality (VR), Augmented Reality (AR), and Mixed/Merged Reality (MR).

These technologies have been advancing rapidly, setting the stage for a vast range of interactive and immersive experiences. XR can pull you in multiple dimensions, with endless opportunities to create unique and innovative experiences. To help illustrate this, we will explore these technologies as they relate to our five basic senses (sight, hearing, touch, smell, and taste), plus the potential for a "sixth sense."
“A mind that is stretched by a new experience can never go back to its old dimensions”
— Oliver Wendell Holmes, Jr.
1. Sight
This is probably the first sense that comes to mind when thinking of a sensory experience in XR. AR technology primarily focuses on what can be seen, as its main function is to overlay computer-generated information onto the real world. Pokémon GO is a popular example of this technology, enhancing the experience of reality by combining it with a world of Pokémon.

VR, on the other hand, allows for a fully immersive experience in digital worlds. Fictional settings, like those seen in many video games, are popular VR retreats, created through 3D modeling, animation, and programming. Through the same technology, one can experience a place that mimics the real world, like GTA's Vice City, which is based on the real city of Miami. With 360° cameras, one can even immerse themselves in relatively accurate representations of the real world, an approach popular in the real estate industry with 360° virtual tours. VR experiences can take you many places, only a headset away. There are several VR headset options from companies like Facebook (Oculus), Sony (PlayStation), and Google (Daydream). Headsets can use rotational or positional tracking. Rotational tracking limits the user to a fixed location and viewpoint, like in a 360° video, while positional tracking lets the user roam freely within a simulation, creating an even deeper sense of immersion.
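As a rough illustration of that difference, a rotational (3DoF) pose carries only orientation, while a positional (6DoF) pose adds a location in space. The sketch below is a simplified, hypothetical data model written for this article, not any headset vendor's actual SDK.

```typescript
// Hypothetical, simplified pose types illustrating rotational (3DoF)
// vs. positional (6DoF) tracking. Real headset SDKs expose richer data.

type Quaternion = { x: number; y: number; z: number; w: number };
type Vector3 = { x: number; y: number; z: number };

// Rotational (3DoF) tracking: only which way the user is looking.
interface RotationalPose {
  orientation: Quaternion;
}

// Positional (6DoF) tracking: which way the user is looking AND where they are.
interface PositionalPose extends RotationalPose {
  position: Vector3;
}

// A 360° video player only needs orientation to choose which part of the
// sphere to show; a room-scale simulation also moves the virtual camera.
function updateCamera(pose: RotationalPose | PositionalPose): void {
  applyOrientation(pose.orientation);
  if ("position" in pose) {
    applyPosition(pose.position); // roam freely: deeper immersion
  }
}

// Stubs standing in for a rendering engine's camera controls.
function applyOrientation(q: Quaternion): void { /* rotate virtual camera */ }
function applyPosition(p: Vector3): void { /* translate virtual camera */ }
```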

MR combines AR and VR technology to allow for visual experiences that merge computer-generated content with the real surrounding environment. You can experience a digital simulation, like you would in a VR application, but placed in your true surroundings, as in an AR application. Spatial mapping technology can be used to generate a 3D map of the real environment for use in conjunction with a digital application. MR experiences are obtained through MR headsets such as the Magic Leap 1 and Microsoft's HoloLens 2. Below are snippet examples of varying MR applications that Magic Leap created for the oil and gas company BP.
2. Hearing
Hearing is the auditory perception of sound. Immersive audio is a more advanced, 3D version of audio. The same way a 360° video camera can capture an experience from all directions, 360° audio provides spatialized sound that mimics how we hear things in real life: each sound arrives from the direction of its virtual source. If you are in a VR simulation and a band is playing on your right, you will hear the music coming from that direction, and as you walk closer to the band, the volume will increase as it would in reality. This allows for more accurate and immersive sense perception in XR applications through the use of sound.
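To make the idea concrete, here is a minimal sketch of how a spatial audio engine might compute the gain and left/right balance for a single source, using simple inverse-distance attenuation and the angle between the listener's facing direction and the source. The function names and falloff model are illustrative assumptions for this article, not a specific audio API.

```typescript
// Minimal, illustrative spatial-audio math: louder when closer,
// panned toward the side the sound actually comes from.
// The inverse-distance model and names here are assumptions, not a real API.

type Vec2 = { x: number; y: number };

function spatialize(
  listenerPos: Vec2,
  listenerForward: Vec2, // unit vector the listener is facing
  sourcePos: Vec2
): { gain: number; pan: number } {
  const dx = sourcePos.x - listenerPos.x;
  const dy = sourcePos.y - listenerPos.y;
  const distance = Math.hypot(dx, dy);

  // Inverse-distance falloff: volume rises as you walk toward the band.
  const gain = 1 / Math.max(distance, 1);

  // The signed angle between the facing direction and the source direction
  // determines the left (-1) to right (+1) balance.
  const cross = listenerForward.x * dy - listenerForward.y * dx;
  const dot = listenerForward.x * dx + listenerForward.y * dy;
  const angle = Math.atan2(cross, dot); // radians, positive = source on the left
  const pan = -Math.sin(angle);         // map to -1 (left) .. +1 (right)

  return { gain, pan };
}

// Example: a band about 4 m away, slightly to the listener's right.
const { gain, pan } = spatialize({ x: 0, y: 0 }, { x: 0, y: 1 }, { x: 2, y: 3.5 });
console.log(gain.toFixed(2), pan.toFixed(2)); // attenuated volume, panned right
```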
The combination of audio-visual sensations could arguably be the most popular. These experiences are seen in AR, VR, and MR. One example is through videos. While we have been exchanging videos with one another for a while now, 360° technology has allowed us to share even more immersive memories. The video below is an example of an audio-visual VR experience of the Tomorrowland music festival.
3. Touch
Touch is the sensation derived from coming in contact with an object. While you may not actually come in contact with real objects in an XR experience, wearable devices, such as gloves and bodysuits, are used to simulate the perception of touch in virtual environments. This is achieved with haptic technology and feedback through forces, vibrations, and motions. You may have experienced basic tactile haptics while using a steering wheel controller in a racing game. The wheel uses vibrations and resistance to try to emulate the real-world experience of driving a car.
Wearable technology can allow you to perceive textures in digital environments, and force feedback can convey an object's size and weight. The specific mechanisms and technology used to emulate touch vary by manufacturer. For example, HaptX Gloves use patented microfluidic technology that combines tactile feedback, force feedback, and motion tracking. The Teslasuit, a full-body haptic suit, uses complex haptic feedback along with features such as Electric Muscle Stimulation (EMS), a climate control system, motion capture, and biometrics to allow for unique immersive sensations. Biometrics are interesting because they use real-time data to relay information such as heart rate, allowing emotional and stress levels to be incorporated into an experience. Wearable devices have multiple applications, from creating experiences to analyzing performance: they can be used in sports enhancement, workforce training, and public safety simulations.
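As a simplified illustration of the force-feedback idea, the sketch below maps a virtual contact force to a vibration level that a wearable actuator could play back. The thresholds, units, and scaling are invented for illustration; real devices such as HaptX Gloves or the Teslasuit use proprietary hardware and SDKs.

```typescript
// Illustrative only: map a simulated contact force to a vibration level
// a haptic actuator might play. Thresholds and scaling are invented here;
// real haptic SDKs differ.

interface ContactEvent {
  forceNewtons: number;                          // estimated force of the virtual contact
  bodyRegion: "palm" | "fingertip" | "forearm";  // where the actuator would fire
}

// Clamp the force into a 0..1 vibration amplitude, with a soft floor so
// light touches are still perceptible and hard impacts do not saturate.
function vibrationAmplitude(contact: ContactEvent, maxForce = 20): number {
  const normalized = Math.min(contact.forceNewtons / maxForce, 1);
  return Math.max(normalized, contact.forceNewtons > 0 ? 0.05 : 0);
}

// Example: brushing a virtual texture vs. gripping a heavy virtual object.
console.log(vibrationAmplitude({ forceNewtons: 0.5, bodyRegion: "fingertip" })); // 0.05
console.log(vibrationAmplitude({ forceNewtons: 15, bodyRegion: "palm" }));       // 0.75
```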
4. Smell
Smell is the perception of odors. Simulating scents is relatively new in the XR space. There are a handful of companies dabbling in this technology, and FeelReal is one of them. Currently on pre-order, their multi-sensory mask is a device designed to snap onto several popular headsets. It incorporates some touch sensations through vibration, water mist, and micro heaters/coolers, but its distinguishing aim is the sensation of smell. It uses patented interchangeable scent cartridges that mimic hundreds of smells, claiming to create a range of odors from natural scents like coffee and relaxing scents like lavender to realistic scents like burning rubber.
5. Taste
Taste is one of the most difficult senses to recreate, as it is derived from a chemical reaction on the tongue. Scientists and entrepreneurs are racing to emulate this sense in the XR space. While still in the experimentation phase, electrical stimulation may be used to achieve this: electrodes placed on the tongue have the potential to emulate the perception of flavor. Researchers can also manipulate heating and cooling to this end, or even use electric muscle stimulation to create the perception of hardness or chewiness.
However, taste is an interesting sense: it is heavily influenced by the other senses. Robin Dando, professor of food science at Cornell, describes how "when we eat, we perceive not only just the taste and aroma of foods, we get sensory input from our surroundings — our eyes, ears, even our memories." Designers can therefore get creative and incorporate tasting experiences in XR without needing to recreate taste itself. Project Nourished has been playing with this idea, using smell to evoke memories of taste: it creates 3D-printed "food" made from algae, paired with aromatic diffusers that release desired scents. The James Beard Foundation approached this differently with its recent debut of Aerobanquets RMX, an XR experience that uses art to enhance and redefine dining. Instead of emulating taste through smell, artist Mattia Casalegno asked questions like "how can you give color to a flavor?" and "what shape is taste?" to create visual experiences that enhance the flavor of the food.

6. Bonus: “Sixth Sense”
A sixth sense is an intuition that provides awareness not usually obtained through normal perception. Machine learning and artificial intelligence (AI) can play a role in XR experiences, acting as an advisory sense. Take the medical field, for example, where AR is already being used by professionals. In practice, doctors can visualize computer data, such as MRI and CT scans, to analyze patients without having to go beneath the skin. When these devices become embedded with AI capabilities backed by large amounts of data, doctors will be assisted in making instantaneous, life-saving decisions, such as providing the correct medicine or performing the correct procedure.

The crayon box approach
Think of all of these technologies as crayons in a box. While all the colors can be used at once to create a picture, only a few may be needed depending on the vision. XR technology is being used in a wide range of fields, from the non-profit sector to medicine. It's an exciting world to follow, with new solutions, technologies, and cross-applications constantly coming to life. Multi-sensory experiences also allow for better design in terms of accessibility and inclusion. For example, 360° audio and haptic wearable technology can be used to design immersive experiences for the blind and visually impaired.
You may be familiar with the saying "a picture is worth a thousand words," or "a video is worth a million." So what is the value of an XR experience? How much experience and knowledge can we share through these tools? Increasing our shared experience base will deepen our knowledge base and continue to propel innovation. While knowledge is finite, imagination is infinite. Creatively combining XR technology with our imagination will only continue to open new doors.
“Imagination is more important than knowledge. For knowledge is limited, whereas imagination embraces the entire world, stimulating progress, giving birth to evolution.”
— Albert Einstein