Subway for the blind

How the design I’m most proud of is one for which I never created a single pixel.

Bartek Jagniątkowski
Published in UX Collective · 7 min read · Jul 13, 2018

Back in 2015 I quit working in advertising.

Wanting to fully commit to UX design, I focused my search and found a company building a product I knew nothing about, so it felt fitting to try my luck there. It remains, to this point, one of the things I’m most proud of and, paradoxically, one for which I never created a single pixel.

Back in 2014, Warsaw’s city council started a project called ”Virtual Warsaw.” Its primary goal was to make the Polish capital friendlier and more accessible. It consists of many smaller projects addressing different issues across the whole spectrum of problems, with a focus on helping blind and vision-impaired people move about the city. One such project was a small city game in and around Warsaw’s Centrum Nauki Kopernik (Copernicus Science Centre) and Multimedialny Park Fontann (Multimedia Fountain Park), which included navigating between the game’s different sections at those sites. Since the CNK location is near a subway station, there was a need to navigate to, into, and out of that station as well.

A small disclaimer: unfortunately, I was not part of defining the project’s initial scope and requirements, so I can’t comment on them or take responsibility for them. All I can do is present the part I played in this project.

While you can make far-reaching assumptions based on the experience of a regular user (”good practices,” market research, A/B testing, usage analytics, etc.), for groups of users with unconventional methods of interacting with their devices (physical or mental disabilities, non-standard hardware extensions), in-person contact and unfiltered feedback are crucial.

When I joined the project, it was at an advanced stage of development, but it was becoming more and more apparent that the initial assumptions did not hold in the real environment, making the experience troublesome and confusing instead of simple and straightforward. The beacon-based navigation used in this project was not precise enough to reliably support a blind person. People with normal vision can verify erroneous notifications from the system: it’s pretty obvious you would ignore a suggestion to ”walk straight ten meters” when facing the train tracks. For a blind person, such a situation might have grave consequences. Additionally, the complicated calibration process used to determine the direction the user was facing was very cumbersome and would fail at critical moments. So our task was to smooth out those imperfections and potential errors in the user’s positioning and make the system more helpful and usable.

Together with the team, we started thinking and searching for suitable solutions, keeping in mind the main idea, the time frame, and the technical limitations of the already established infrastructure (the beacons and their management system). This way we came back to one of our earlier ideas that had never been fully explored. Before we started working on this implementation, we needed to put those new theories to the test. While doing research, we stumbled upon a project named Wayfindr, which seemed to be going a very similar route.

Wayfindr Open Standard / wayfindr.net

The solution chosen by Wayfindr gave us the validation we needed at this stage. With the base idea in place, we thought of tracking the user’s position using zones instead of points. From the system’s perspective the change was in the logic: instead of following a specific target in space, we needed a history of this target’s visits, and if there was no history, we built the route based on the starting zone.

Knowing the user ”appears” in one of the zones on the platform (meaning there’s no history of this user’s visits to other zones), we assumed that this user arrived on a train, was currently on the platform, and wanted to get out of the station. By analogy, if a user ”appeared” at the entrance/exit zone (with no history from other zones), we could safely assume this user wanted to get to the platform and onto the train. So as a starting point we chose these two scenarios, which we thought were the most common situations:

  1. The user arrives by train, is on the platform and wants to leave the station.
  2. The user enters the station from the street and wants to get to the platform.

Because we were building a test stage of a solution aimed mainly at blind people, we weren’t thinking about situations where a new user appears in the middle of a planned scenario: if you’re blind, you would most probably use the application the whole time rather than turning it on and off.

A typical solution for the first scenario is ”I want to get out of the station using the exit I prefer” (if there is more than one exit). For the second: ”I want to get to the platform and board the train going in the direction I need.” (The station where we ran the tests has only one line running through it; with more lines, an additional element would be the ability to select a specific line.)

Next, we needed to decide how to divide the station into zones. We wanted the critical elements of the station’s interior (escalators, elevators, security, toilets, etc.) to be included in a specific zone, and each zone’s description to be written in the plainest language possible while still conveying as much information as possible.
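A zone record of the kind described might look like the sketch below. The field names, the zone identifier, and the wording of the message are all made up for illustration; the real system’s data model may differ entirely.

```python
# Hypothetical zone record: landmarks tie the zone to the station's interior,
# and the message is kept short and concrete, as the user tests demanded.
zone = {
    "id": "escalator_top_south",
    "landmarks": ["escalator", "ticket machines"],
    "message": "Escalator down to the platform two meters ahead, on your right.",
}
```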

After we drew the zones on paper, we had to write down various scenarios for the possible routes users could take walking to and from the platform (including variations with trips to the toilet, using elevators, etc.). Without a direction of movement received from the device, we had to come up with a way to determine that direction, even if only approximately.
With the history of zones (which zones the user entered, which they left, and in what order), we thought we should be able to determine, with reasonably high probability, the direction the user was taking and estimate the target, which in turn let us send the user the information needed at specific moments.
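Estimating direction from the order of zone visits can be sketched as follows. This is a deliberate simplification under assumed names: I model the route as a single ordered corridor of zones, whereas a real station would be mapped as a graph of adjacent zones.

```python
# Hypothetical ordered corridor from street entrance to platform.
CORRIDOR = ["entrance", "mezzanine", "escalator_top",
            "escalator_bottom", "platform"]

def estimate_direction(history):
    """Guess travel direction from the last two zones in the visit history."""
    if len(history) < 2:
        return "unknown"  # need at least two visits to infer movement
    prev = CORRIDOR.index(history[-2])
    curr = CORRIDOR.index(history[-1])
    if curr > prev:
        return "toward_platform"
    if curr < prev:
        return "toward_exit"
    return "unknown"

print(estimate_direction(["entrance", "mezzanine"]))       # toward_platform
print(estimate_direction(["escalator_top", "mezzanine"]))  # toward_exit
```

Once the direction is guessed, the system can pick which zone message to send next, which is all the scenario scripts needed.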

After the scripts were ready, they needed a trial by fire. We met with a group of smartphone users from Polski Związek Niewidomych (the Polish Blind Association) who were willing to help us. We ”became” our application, reading out the notifications the actual app would send to the user in a specific zone, following a particular scenario while walking with our testers. Observing their reactions and listening to their comments, we took notes we later incorporated into our scripts: what does and doesn’t work, what they thought of the specific language we used, what is utterly useless, what needs to be changed or rewritten, and what can be cut away without losing functionality, shortening each message to a bare minimum.

It quickly became apparent that one of the main errors we made initially was trying to explain the area in too much detail. We had turned our messages into essays, which wasted the users’ time: they had to listen to the whole story instead of moving forward. It was somewhat the equivalent of an interface animation that’s just too long, where you can’t skip it or do anything except wait for it to finish.

Then we had to incorporate the changes and fixes made after the initial tests, create the zones and mechanics inside our system for the mobile app to use, and run another series of tests, this time using the actual device.

The last phase of the project was a final clean up of the messages’ contents and testing the application without any help or support, just the blind user with their device.

And that’s where my involvement with the project came to an end.

After all the work and tests were done I had two observations.

One:
it was surprising how well blind and vision-impaired users operate touchscreen devices, and how frustrated they get when using software that’s not tailored to their needs. Once again it turned out how many unreasonable assumptions, biases, and meritless ”universal truths” we carry with us, and how wrong it is to rely on those ”truths.”
It’s a truism, of course, but one that keeps being forgotten: you need to listen to your users. And since I was new to the field, I had a lesson to learn.

And two:

if this application were to have only one function, the most useful to the blind would be a message telling passengers coming down the escalator to the platform which side of the platform serves trains going in which direction.

That’s how, after building the whole thing, the MVP for this product was redefined: something that was supposed to be a small element of the system turned out to be its most valuable part, one to which we had paid little attention.

The Human Factor / Mayors Challenge 2015 Learning


