How to improve your mobile UIs by learning from personal touch patterns

Daniel Buschek · Published in UX Collective · Jul 2, 2019

There is a surprising complexity to the daily dance of your fingers on your phone. This story reveals what’s hiding in your touch — and how you can use this knowledge and machine learning to build better mobile touch interfaces.

Over the last six years, I’ve studied people’s everyday touch interactions with smartphones in great depth. What fascinates me the most is the rich hidden structure and individuality of human behaviour, which I found even in the trivial action of pressing a button. We’ll reveal and utilise these patterns here.

A deeper understanding of touch

To improve mobile GUIs with a deeper understanding of touch, we first have to examine touch errors: these offsets are the little shifts between the intended target and the actual touch point, as shown in the figure below.

A touch offset (or touch “error”) is the shift between the user’s intended target (here: the crosshair) and the actual location at which the user has hit the screen.

If you’ve ever tried to send a message without typos on a shaky bus or train ride, you know that humans are not pixel-perfectly accurate touchscreen operators. Our touch offsets result from body movement, thumb stretching, hand-eye parallax, small button sizes relative to the finger, and other factors.

Revealing personal touch behaviour patterns

We discover insightful behaviour patterns by looking at offsets for many touches across the screen. The next figure shows what such a dataset looks like. It also shows an offset model, that is, a regression model fitted to the offsets (for technical details, see this paper).

Capturing and modelling a user’s offset pattern: Touch offsets are collected over multiple targeting actions, e.g. while typing, using buttons in apps, etc. A regression model is fitted to this data to map touch locations to intended target locations. Visualising this mapping as arrows reveals the user’s offset pattern.

The fitted model has learned to map touch locations to intended target locations, shown by the little arrows in the rightmost plot.
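To make this concrete, here is a minimal sketch in Python of how such a model could be fitted. The choice of a second-order polynomial with ridge regression, the helper names (fit_offset_model, predict_intended), and the placeholder data are my illustrative assumptions, not the exact setup from the paper.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import Ridge

def fit_offset_model(touches, targets):
    """Fit a regression model mapping a 2D touch location to its 2D offset
    (offset = intended target minus actual touch). Illustrative choice:
    2nd-order polynomial features + ridge regression."""
    model = make_pipeline(PolynomialFeatures(degree=2), Ridge(alpha=1.0))
    model.fit(touches, targets - touches)
    return model

def predict_intended(model, touch_xy):
    """Predicted intended location = touch location + predicted offset."""
    touch_xy = np.atleast_2d(touch_xy)
    return touch_xy + model.predict(touch_xy)

# Placeholder targeting data: actual touch points and the corresponding
# intended targets (e.g. logged while the user types or hits crosshairs),
# in screen coordinates normalised to [0, 1].
touches = np.array([[0.31, 0.72], [0.55, 0.40], [0.82, 0.15], [0.18, 0.55]])
targets = np.array([[0.30, 0.70], [0.57, 0.38], [0.80, 0.18], [0.20, 0.52]])

offset_model = fit_offset_model(touches, targets)
print(predict_intended(offset_model, [0.50, 0.50]))
```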

It is insightful to take a look at these patterns for multiple people. For visual clarity, we now use colour plots instead of the little arrows. The colours in the next figure show horizontal offsets, that is, how far to the left or right of the targets people tend to touch.

Offset patterns showing horizontal offsets. The data is from a user study in which participants had to hit crosshair targets, using the thumb of their right hand.
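If you want to produce such a colour plot for your own data, a rough sketch could look like this (assuming the offset_model and the normalised screen coordinates from the snippet above):

```python
import numpy as np
import matplotlib.pyplot as plt

# Evaluate the fitted model on a grid covering the screen
# (normalised coordinates; assumes `offset_model` from the snippet above).
xs, ys = np.meshgrid(np.linspace(0, 1, 60), np.linspace(0, 1, 120))
grid = np.column_stack([xs.ravel(), ys.ravel()])
predicted_offsets = offset_model.predict(grid)

# Horizontal component, flipped so the value is touch_x - target_x:
# negative means people tend to touch left of the target ("undershoot"),
# positive means right of it ("overshoot").
horizontal = (-predicted_offsets[:, 0]).reshape(xs.shape)

plt.imshow(horizontal, origin="lower", extent=[0, 1, 0, 1],
           cmap="coolwarm", aspect="auto")
plt.colorbar(label="horizontal offset (touch minus target)")
plt.title("Predicted horizontal offset pattern")
plt.show()
```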

We learn more from looking at many such patterns: For example, most people using their right thumb tend to “undershoot” targets near the left screen edge to minimise thumb stretching (blue areas in the plots). In contrast, they tend to “overshoot” targets near the right screen edge, since bending the thumb too much is uncomfortable (red areas).

Moreover, the fact that no two patterns look exactly the same reveals the individuality of touch behaviour. These differences between people are due to varying hand and finger size, targeting strategies, personal tradeoffs between speed and accuracy, experiences with touchscreens and the device, and so on.

In summary, people show highly individual offset patterns. Hence, these patterns are key to building “intelligent” GUIs that adapt to individuals. We’ll look at how you can build such GUIs next.

Building touch GUIs that improve themselves

Let’s highlight this important use case for offset models right away:

Touch offset models enable us to build mobile GUIs that learn and adapt to the user’s individual input behaviour.

How? Using a touch offset model, we can build a system that predicts the user’s intended location for each touch. Shifting the touch location according to the model’s prediction increases accuracy and reduces errors. The figure shows how to use the model to achieve this.

Improving touch accuracy with an offset model: Upon touching the screen, the model is used to predict the intended location from the actual touch location (red arrow). If the model works well, shifting the touch location according to the model’s prediction brings it closer to the target and thus improves accuracy.
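In an app, the correction step could look roughly like the following sketch, which reuses the hypothetical predict_intended helper from above; gui.find_element_at stands in for whatever hit-testing mechanism your UI framework actually provides.

```python
def handle_touch(raw_x, raw_y, gui, offset_model):
    # Shift the raw touch to the model's predicted intended location
    # before hit-testing.
    corrected_x, corrected_y = predict_intended(offset_model, [raw_x, raw_y])[0]
    element = gui.find_element_at(corrected_x, corrected_y)
    if element is None:
        # Fall back to the raw location if the corrected point hits nothing.
        element = gui.find_element_at(raw_x, raw_y)
    return element
```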

In this way, offset models can significantly improve both finger touch and stylus touch, as shown in studies in the lab as well as in everyday life.

Typically reported improvements in accuracy, hit rate, or error reduction are around 8%, depending on the context. This is relevant considering the large number of touches many of us perform every day. Related approaches are already being used successfully in keyboard apps. You can learn more in my article on keyboard adaptation.

Moreover, note that the bottleneck for accuracy on modern touchscreens is no longer hardware sensor resolution but rather human finger accuracy. Hence, touch correction with offset models provides a software-based approach to further push the limits of input precision on touch devices.

Practical insights for deploying touch correction

It can be challenging to get machine learning methods to work in interactive applications used in rich contexts in everyday life. For deployments of touch offset models, key considerations include:

We first need to collect touches to train the model. The literature points towards 10 to 300 touches, which could be collected in an enrolment step or game (e.g. see this paper) or during use (e.g. typing). It’s also possible to ship an app or OS with a model pre-trained on data from other people. This avoids setup effort for the end user, yet it will likely be less effective than a model trained specifically for that user (the sketch after these considerations combines both ideas).

Touch behaviour is also shaped by dynamic factors such as movement, left- vs right-hand use, thumb vs index finger input, etc. To improve touch correction, we may therefore consider several strategies: we could update the model regularly, constrain it to make “cautious” predictions, or integrate a classifier that identifies, say, thumb vs index finger use and then applies the appropriate offset model. I won’t go into details here, but promisingly, touch data itself can help to do just that (see e.g. these papers).
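To make these considerations concrete, here is one possible sketch that combines them: it serves a model pre-trained on other people’s data until enough personal touches have been logged for a given posture, then refits a posture-specific model (reusing the hypothetical fit_offset_model and predict_intended helpers from earlier). The posture label is assumed to come from some separate classifier, and the threshold of 200 touches is just an illustrative value within the 10 to 300 range mentioned above.

```python
import numpy as np

class AdaptiveOffsetCorrector:
    """Sketch: use a model pre-trained on other people's data until enough
    personal touches are logged for a posture, then refit per posture.
    The posture label (e.g. "thumb" vs "index") is assumed to come from a
    separate classifier, which is not implemented here."""

    def __init__(self, global_model, min_touches=200):
        self.global_model = global_model   # pre-trained on other users' data
        self.min_touches = min_touches     # illustrative threshold
        self.data = {}                     # posture -> list of (touch, target)
        self.personal_models = {}          # posture -> personal offset model

    def log_touch(self, posture, touch_xy, target_xy):
        samples = self.data.setdefault(posture, [])
        samples.append((touch_xy, target_xy))
        if len(samples) >= self.min_touches:
            # Refit the posture-specific model on the user's own touches
            # (a real implementation would refit less often).
            touches = np.array([t for t, _ in samples])
            targets = np.array([g for _, g in samples])
            self.personal_models[posture] = fit_offset_model(touches, targets)

    def correct(self, posture, touch_xy):
        # Prefer the personal, posture-specific model; fall back to the
        # global model while too little personal data is available.
        model = self.personal_models.get(posture, self.global_model)
        return predict_intended(model, touch_xy)
```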

Informing mobile GUI design with offset models

If you work in UI design or development, you can benefit from offset patterns even if you don’t integrate offset models directly into your app.

We can instead use offset models at design time: For example, we can run them “in reverse” to predict likely touch locations given a GUI element’s location. In other words:

Touch offset models allow us to simulate touch behaviour to guide UI design.

As an example of this, here we use offset models to estimate suitable target sizes for different screen regions: for each pixel, we predict where a touch would likely occur if a target were centred at that pixel. We then measure the distance between that predicted touch and the target centre to see how big the target would have to be to accommodate this touch.
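A rough sketch of this analysis: here the “reverse” model is simply fitted in the opposite direction (target location to expected touch location) by swapping inputs and outputs of the placeholder data from the first snippet, and the required diameter is taken as twice the distance between target centre and predicted touch. Both are illustrative simplifications rather than the exact procedure behind the figure below, which would also have to account for the spread of touches, not just the mean offset.

```python
# "Reverse" model: predict where a touch would likely land for a target
# centred at a given location (fitted by swapping inputs and outputs of
# the placeholder data from the first snippet; reuses its imports).
reverse_model = make_pipeline(PolynomialFeatures(degree=2), Ridge(alpha=1.0))
reverse_model.fit(targets, touches)

# For each grid cell, estimate how large a circular button centred there
# would need to be to contain the predicted touch.
xs, ys = np.meshgrid(np.linspace(0, 1, 30), np.linspace(0, 1, 60))
centres = np.column_stack([xs.ravel(), ys.ravel()])
predicted_touches = reverse_model.predict(centres)
distances = np.linalg.norm(predicted_touches - centres, axis=1)
required_diameter = (2 * distances).reshape(xs.shape)   # in normalised units
```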

I used such an analysis to derive the target size guidelines shown here.

Example of deriving GUI guidelines from offset patterns: The plots show the recommended minimum diameter of circular buttons, based on predicting expected touch locations with offset models. Models are trained on data from our study with 28 people using a Samsung Galaxy Note 4 phone. People used their right hand for input (index finger or stylus), both while sitting and walking.

Keeping in mind that people used their right hand for input in this study, the figure reveals several insights:

  1. Reaching matters even for index fingers (not just thumbs): Targets in the far corner (i.e. top left) need to be larger to improve reachability and avoid mistouches due to “undershooting”. Users likely minimise the effort of reaching movements here.
  2. Utilise focus areas for accuracy-demanding tasks: An area slightly right of and below the centre is well suited to highly accurate tasks (e.g. map point selection, drawing, image manipulation). The stylus is particularly strong here.
  3. Stylus vs finger is not as simple as previously thought: In line with common knowledge, the stylus is more accurate than the finger. However, comparing the plots above reveals that the extent of this advantage is not constant across screen regions. This could inform UIs that support both finger and stylus input, maybe even bimanual input.

A disclaimer: We should not over-generalise the above figure into a fixed “design rule”, since these particular patterns were derived from data from a single device. We could include more data to compare devices, usage contexts, and so on.

Nevertheless, these target size charts and discussed insights illustrate how touch offset models can guide mobile GUI design.

Takeaways

Humans don’t touch targets pixel-perfectly on mobile screens. The little shifts between your target button and your finger are called touch offsets. In this article, we’ve discovered the behaviour patterns and use cases that emerge from studying these offsets in detail. We’ve learned three things:

  1. Touch offset patterns are highly individual, almost like a behavioural fingerprint of the user. Hence, personalising touch handling can push the limits of input precision.
  2. Offset models allow us to build GUIs that “improve themselves”, namely by adapting to the user’s individual targeting behaviour to increase touch accuracy and reduce input errors.
  3. Offset models enable us to simulate touch input to guide UI design, for example, to inform the placement and size of GUI elements.

In a broader view, this line of research illustrates how personal patterns in input behaviour can be utilised to improve UIs — and that such useful patterns may be found even in seemingly trivial actions, such as tapping a button on the screen.

I’m curious to hear your ideas for using touch predictions in GUI design or for GUI adaptations during use. For more insights from research on intelligent and adaptive user interfaces, follow me here or on Twitter.

Want to learn more? Check out my article on how to personalise your mobile keyboard.

Liked the plots? To train and visualise your own offset models, check out the toolkit & data we released here.


Daniel Buschek is a professor at the University of Bayreuth, Germany, working on human-computer interaction, intelligent user interfaces, and interactive AI.