Smart UX

These are the levers to open subway car doors:

If you pull them while the train is moving, nothing happens, because they are locked for safety. No feedback is shown to you as a user, and there is no indication of whether the levers are currently locked or unlocked. If you are an experienced subway rider, though, you can hear the sound of unlocking and then understand that you can now pull the levers.

And this is the button to open doors of the new subway car:

They have added a visual indication of whether the doors are locked or unlocked: the LEDs around the button. But there is more to it, a feature that I would argue belongs to a new generation of Smart UX, a combination of smart data and UX.

If you look at the old lever system from the data perspective, the user is giving you a signal about their intention to open the door. If the door is locked, the user's safety is ensured by keeping the doors closed, but their signal goes unused and is lost.

As for the new button, you can press it even while the car is moving and the door is locked. The door will use this signal to understand your intention, remember it, indicate with a special combination of the LEDs that it knows you want to exit, wait until the car stops at the next station, and then unlock itself automatically.
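The behavior described above can be sketched as a tiny state machine. This is only an illustration, not the actual door firmware; all names (`DoorButton`, `press`, `on_station_stop`, `led_state`) are hypothetical.

```python
class DoorButton:
    """Illustrative sketch: a button that remembers the user's intention."""

    def __init__(self):
        self.locked = True            # doors are locked while moving
        self.open_requested = False   # the remembered user intention
        self.led_state = "locked"

    def press(self):
        if self.locked:
            # The signal cannot be acted on yet, but it is not lost:
            # remember it and acknowledge the user via the LEDs.
            self.open_requested = True
            self.led_state = "request_acknowledged"
        else:
            self.open_door()

    def on_station_stop(self):
        # The car has stopped; doors may now unlock.
        self.locked = False
        if self.open_requested:
            # Act on the remembered intention automatically.
            self.open_requested = False
            self.open_door()

    def open_door(self):
        self.led_state = "open"
```

Pressing the button while `locked` flips `open_requested` instead of doing nothing, which is exactly the difference from the old levers: the signal is stored and replayed when the state allows it.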

Not only is the Smart UX more comfortable for the user, because it reduces the amount of interaction required to open the doors, as well as the knowledge of the current lock state needed to operate them successfully. The robotic system also shows more respect to its human user by acknowledging their presence, remembering that their last interaction wasn't successful, and being polite and helpful by trying to predict their next actions.

I’m not saying that the new button is better than the old levers in every respect. It could be that the levers are easier to understand for people who have never seen round buttons before and wouldn’t think of pressing one. The levers are also more similar to ordinary door handles and thus easier for that group to understand. And in case of a malfunction, the doors have to be unlocked and pushed aside manually and mechanically – operating the levers could be easier than trying to open doors that have no handles at all.

What I’m saying is that a modern Smart UX treats every user interaction as a data signal exposing the true user intention, and it doesn’t waste a single bit of this signal, but uses it to build up a mental model of the user's intention and to act on it.
