Hand-based VR/MR interaction for deleting entities
Design and prototype work exploring hand interactions for deleting distant as well as nearby digital entities in Mixed Reality.
Deleting an entity or closing an application is one of the most ubiquitous operations performed in any application, and it is necessary for keeping data organized. On a computer, there are multiple ways to delete a file: `cmd + delete`, the Delete key, or dragging the icon to the trash bin. On mobile phones, you can press a delete icon, use a side-swipe gesture, or long-press to reveal the delete option. All these interactions are specifically designed and defined for the form factor of the respective device.
None of the interactions mentioned above provides a viable interaction model in mixed reality. The absence of such gestures opens up an opportunity to explore new interactions for deleting entities in mixed reality. This prototype explores one of the many possible ways to delete an object in space.
Currently available methods:
- Deletion operation mapped to the controller button event
- Throwing object
- Ray-casting pointer with controller click on Close icon
Inspiration:
- Iron Man 2
- Toddler Throwing Things
Possible Interactions:
- Delete button on the contextual menu: Controller Mapping
- Throw away
- Finger gun
- Abstract stretch till it deforms and breaks
Prototype Breakdown:
For the throw-to-delete interaction, I prototyped a behavior of selecting a faraway object, bringing it within arm's reach for any operation, and then either returning the entity or performing a throw gesture to delete it. Possible use cases include summoning an entity from a list of elements, such as emails, immersive project reports, or a widget; inspecting or operating on it; and then either dismissing it by deleting it or sending it back to its original location. For example, archiving an email directly from the list, or closing a widget that isn't required anymore.
The following three stages make up the full prototype:
- Selecting/Picking
- Throw-to-Delete
- Returning
Part 1: Selecting/Picking
MRTK, an open-source toolkit for spatial computing, is equipped with modular and performant code blocks that speed up the prototyping process. For this prototype, I used the MRTK framework.
By adding an ObjectManipulator component to a GameObject, one can easily interact with the object for various operations like selecting, clicking, far/near grabbing, scaling, or rotating. Appropriate settings can be assigned based on the requirements of the application. For the current prototype:
- Physics behavior on release is disabled for better control over the entity, so that a release can be classified as a selection or a throw.
- Added a TransformConstraint to limit the scale operation.
- Assigned operations like clicking and grabbing to the entity.
MRTK’s pointer design enables bringing a grabbed entity within arm’s reach through a pull-back gesture, in which an outstretched hand is pulled back into a folded pose. The speed of the entity’s movement depends on how fast the hand folds or unfolds.
Upon picking, this prototype shows visual feedback of a hover light on the object and provides audio feedback confirming the selection or manipulation-start event. It also plays audio feedback upon releasing the object, acknowledging the conclusion of the interaction.
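The pull-back behavior above can be sketched in plain Python. This is an illustrative assumption of how such a mapping might work, not MRTK's actual implementation; the function name, `gain` parameter, and thresholds are all invented for the sketch.

```python
# Minimal sketch of pull-back summoning: folding the hand (shrinking the
# hand-to-shoulder distance) pulls a far entity closer along the ray;
# unfolding pushes it back. All names and constants are assumptions.

def summon_step(entity_dist, hand_ext_prev, hand_ext_now, gain=4.0, dt=1/60):
    """Return the entity's new distance from the user after one frame.

    entity_dist:   current distance of the entity from the user (metres)
    hand_ext_prev: hand-to-shoulder distance on the previous frame (metres)
    hand_ext_now:  hand-to-shoulder distance on the current frame (metres)
    gain:          how strongly hand folding speed maps to entity speed
    """
    fold_speed = (hand_ext_prev - hand_ext_now) / dt   # > 0 when folding
    new_dist = entity_dist - gain * fold_speed * dt
    return max(new_dist, 0.3)  # never pull closer than ~arm's reach

# Hand folds from 0.6 m to 0.55 m of extension in one 60 Hz frame:
d = summon_step(entity_dist=3.0, hand_ext_prev=0.6, hand_ext_now=0.55)
```

Because the step scales with the folding speed, a quick pull snaps the entity in fast, while a slow fold eases it in, matching the speed coupling described above.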
Part 2: Throw-to-Delete
Once we have direct access to the entity, whether it was selected from afar or is already within arm's length, we want to carry out the deletion operation. A typical everyday gesture is to toss things away.
While the entity is grabbed, it is easy to compute its linear and angular velocity by taking position samples from the previous frames. On release, depending on the entity's velocity, I classify the operation as deletion or as summoning to the user's arm's reach.
In other words:
- Check the magnitude of the velocity (above or below a threshold)
- Check the direction of the velocity (towards the user or in any other direction)
These two checks classify the release as either a deletion or a summoning operation.
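The two checks above can be sketched as a small classifier. The threshold values and the `to_user_dir` vector are assumptions for illustration; the prototype's actual thresholds are not specified in the text.

```python
import math

DELETE_SPEED = 2.0  # m/s: faster releases count as a throw (assumed value)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def classify_release(velocity, to_user_dir):
    """Classify a release by velocity magnitude and direction.

    velocity:    (x, y, z) velocity of the entity at release
    to_user_dir: unit vector pointing from the entity toward the user
    """
    speed = math.sqrt(dot(velocity, velocity))
    if speed < DELETE_SPEED:
        return "none"                       # gentle release: no operation
    cos_angle = dot(velocity, to_user_dir) / speed
    # Fast and toward the user -> summon; fast in any other direction -> delete.
    return "summon" if cos_angle > 0.5 else "delete"

classify_release((0.0, 0.5, 3.0), (0.0, 0.0, 1.0))   # fast, toward user
classify_release((3.0, 0.0, -1.0), (0.0, 0.0, 1.0))  # fast, away from user
```

Gating on magnitude first means a slow, deliberate release near the body is never misread as a throw, regardless of its direction.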
The physics release behavior in MRTK is flawed for far manipulation. Inspecting the MRTK scripts shows that the velocity passed to the grabbed entity upon release is the controller's or hand's velocity, which can differ considerably from the grabbed object's velocity given the distance between the entity and the controller. The GIF below shows how MRTK's release behavior looks unnatural.
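One workaround, sketched below under assumed frame timing, is to estimate the release velocity from the grabbed object's own position history rather than reusing the hand's velocity. The class name and buffer length are illustrative, not MRTK API.

```python
from collections import deque

class VelocityEstimator:
    """Estimate an object's velocity from its last few position samples."""

    def __init__(self, frames=5, dt=1/60):
        self.samples = deque(maxlen=frames)  # ring buffer of positions
        self.dt = dt                         # assumed fixed frame time

    def record(self, position):
        self.samples.append(position)

    def velocity(self):
        if len(self.samples) < 2:
            return (0.0, 0.0, 0.0)
        first, last = self.samples[0], self.samples[-1]
        span = (len(self.samples) - 1) * self.dt
        return tuple((b - a) / span for a, b in zip(first, last))

est = VelocityEstimator()
for i in range(5):
    est.record((0.0, 0.0, 0.1 * i))  # object moves 0.1 m per frame in z
est.velocity()  # roughly (0, 0, 6) m/s
```

Because a far object sweeps through much more space than the hand driving it, sampling the object itself yields a release velocity that matches what the user actually sees.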
Part 3: Returning
After deciding how the selecting, summoning, and deletion operations work, we needed to explore an ability to return the entity to its place.
Just as folding the hand closer to the shoulder is a gesture to bring an entity closer, unfolding the hand and extending it away from the shoulder is a gesture to give or return an entity. The mental model is elongating the hand to return something, place it back, or hand it over.
Inspired by Harry Potter's magical effects, I implemented the gesture mentioned above together with an animation that places the entity back in its original place. Thus, when the hand is stretched outward beyond a specified threshold from the connecting shoulder point, the entity travels back to its original location upon release.
This returning gesture can interfere with the throw-to-delete gesture, as the hand may or may not be extended in either case. To disambiguate, the velocity of the object is weighed against a threshold again. Additionally, holding the grabbed entity at arm's span makes visual feedback appear describing the return track; releasing the entity while this path is visible sends it back to its original position.
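Combining the hand-extension check with the velocity check, the end of a grab might be disambiguated as sketched below. Both threshold values and the function name are assumptions chosen for illustration.

```python
EXTEND_THRESHOLD = 0.45  # m: shoulder-to-hand distance counting as extended
THROW_SPEED = 2.0        # m/s: release speed counting as a throw

def classify_end_of_grab(hand_extension, release_speed):
    """Disambiguate return, delete, and plain release at the end of a grab.

    hand_extension: distance from shoulder to hand at release (metres)
    release_speed:  magnitude of the entity's velocity at release (m/s)
    """
    if release_speed >= THROW_SPEED:
        return "delete"                  # a fast throw always deletes
    if hand_extension >= EXTEND_THRESHOLD:
        return "return"                  # slow release with an extended arm
    return "keep"                        # slow release near the body

classify_end_of_grab(0.6, 0.5)   # extended arm, gentle release
classify_end_of_grab(0.6, 3.0)   # extended arm, fast throw
```

Checking the throw speed first mirrors the text: velocity decides between the two gestures, and only a slow release with the arm extended (while the return track is visible) triggers the return.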