Keyboard Input for Virtual Reality

Jonathan Ravasz · Published in UX Collective · Feb 3, 2017

The third iteration of the keyboard layout enhanced by next word prediction.

Some virtual reality experiences partially rely on keyboard text input. This can be difficult for several reasons: the headset hides the physical keyboard and its keycaps, removing visual feedback, and a static, stationary physical item limits room-scale VR experiences. These difficulties motivated me to start working on a new standard for VR keyboard input.

Tilt Brush by Google turned out to be a good tool for sketching various layouts quickly for testing.

To save space, I decided to go with a reduced QWERTY layout similar to the one used for conventional touchscreen keyboard input. Desktop VR mostly uses six-degrees-of-freedom (6DoF) controllers for interaction. While these controllers allow fairly accurate palm/hand tracking, they cannot yet bring the same accuracy to finger tracking. This limitation results in single-“finger”-per-hand typing, similar to the two-thumb input of touchscreens. The virtual keycaps are positioned on a slight curve around the user, making them easier to reach with the controllers. Each row of keys is shifted slightly vertically to avoid double key collisions, similar to the drum-like keyboard of Daydream Labs.
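The curved arrangement can be sketched with a little trigonometry: each key sits on a horizontal arc at a fixed radius from the user, and each successive row drops slightly. The keyboard itself is built in Unity/C#, so the Python below is only an illustration; the function name, parameters and default values are my own assumptions, not the actual implementation.

```python
import math

def layout_keys(rows, radius=0.5, key_spacing_deg=8.0, row_drop=0.07):
    """Place each key on an arc in front of the user.

    rows: list of strings, one string of characters per keyboard row.
    radius: distance from the user to the keycaps (metres).
    key_spacing_deg: angular spacing between adjacent keys.
    row_drop: how far each successive row sits below the previous one.
    """
    positions = {}
    for r, row in enumerate(rows):
        # Centre the row's arc on the user's forward direction.
        start = -key_spacing_deg * (len(row) - 1) / 2
        for i, key in enumerate(row):
            theta = math.radians(start + i * key_spacing_deg)
            x = radius * math.sin(theta)  # left/right along the arc
            z = radius * math.cos(theta)  # depth, in front of the user
            y = -r * row_drop             # each row sits slightly lower
            positions[key] = (x, y, z)
    return positions

qwerty_rows = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]
pos = layout_keys(qwerty_rows)
```

Because every key keeps the same radius, each keycap is equally easy to reach with an extended controller, and the per-row vertical drop keeps neighbouring rows from overlapping.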

To boost typing speed, similarly to touchscreen keyboards, I decided to implement next-word prediction and an auto-complete algorithm that support the text input in real time. Levenshtein-distance-based auto-correction helps users correct spelling mistakes, while an N-gram model offers predictions for the following words. The predicted and completed words appear above the keyboard as text bubbles; by selecting these bubbles, the user can easily speed up their typing. Currently the keyboard uses a corpus created from conversations on various subreddits, giving it a more general “knowledge”. To adapt it to a specific topic, for example a sci-fi-themed game, a more specific corpus would be more suitable (e.g. dialogue from science fiction movies).
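The auto-correction side can be illustrated with the classic dynamic-programming edit distance. The project is written in C#, so this Python sketch is only for illustration; the `correct` helper and its `max_distance` threshold are my own assumptions about how such a corrector might be wired up, not Punchkeyboard's actual API.

```python
def levenshtein(a, b):
    """Minimum number of insertions, deletions and substitutions
    needed to turn string a into string b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]

def correct(word, vocabulary, max_distance=2):
    """Suggest the in-vocabulary word closest to the typed word,
    unless even the best match is too far away to be plausible."""
    best = min(vocabulary, key=lambda v: levenshtein(word, v))
    return best if levenshtein(word, best) <= max_distance else word
```

For example, `correct("wrld", ["hello", "world"])` maps the typo back to "world", while a string far from every vocabulary word is left untouched.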

Diagrams for the Levenshtein distance and N-Gram word splitting.
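The N-gram prediction can likewise be sketched as a simple bigram model: count which words follow which in the corpus, then offer the most frequent followers as suggestions. The article does not state the order of the model or its data format, so the bigram choice, the tiny stand-in corpus, and the function names below are all illustrative assumptions.

```python
from collections import Counter, defaultdict

def build_bigrams(corpus):
    """Count which word follows which across a corpus of sentences."""
    following = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for current, nxt in zip(words, words[1:]):
            following[current][nxt] += 1
    return following

def predict_next(model, word, n=3):
    """Return the n words most frequently seen after `word`."""
    return [w for w, _ in model[word.lower()].most_common(n)]

# Stand-in for the Reddit-derived corpus described above.
corpus = [
    "i am going home",
    "i am here",
    "i am going to the store",
]
model = build_bigrams(corpus)
```

With this toy corpus, `predict_next(model, "am")` ranks "going" above "here", since it appears after "am" twice. Swapping in a domain-specific corpus, as suggested above, only changes the counts, not the mechanism.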

I am planning to release the VR keyboard on the Unity Asset Store soon, hoping it can become an integral part of various VR experiences.

UPDATE:
Today I released the updated version of my keyboard as open source.

Punchkeyboard is an open-source keyboard for virtual reality, enhanced with autocomplete and next-word prediction for a swift typing experience. It was created with Unity and written in C#. The built-in prediction is based on Reddit conversations, but the repository also contains functionality for creating custom dictionaries for personalised suggestions.

You can download the demo and the full repository here.
