Discovery: eyes-free interaction with GUI is clunky and slow but speeds up with touch

June 19, 2013

Two months' hard labour with an iPhone and I'm through the pain barrier!

The main gain is faster interaction, because the interface lets me use my geographic memory skills to a far higher degree than a keyboard and a tabbed interface ever did.

I think of these touch-screen talking interfaces as three-dimensional (X, Y and time).

The pain I went through was caused by the step change in manual dexterity demanded of my hands and fingers: that skill had to become automatic before the process of operating the interface, and linking it all up with my geographic memory, could become transparent.

Going through this has made me realise how different the time dimension is when one uses an interface eyes-free compared with using it by eye. That much will be obvious to anyone reading this, but what could be a new angle is the contrast between the cognitive processes going on in a blind user and in a sighted user at any one moment in time.

This matters for people performing user testing, for example, though whether a comparison of cognitive processes is made during side-by-side testing of different user groups, I don't know.

Take, for example, that first moment of picking up an iPhone, from two different user perspectives: user A (a VoiceOver user) and user B (an average screen user). Let's assume neither user has a memory of what the screen contains at this stage.

User A explores by touch, hearing VoiceOver read out icon labels wherever the finger contacts the screen: "App Store, Clock, Game Centre" when touching roughly in the centre and sliding the finger up and to the right, or "Messages, Calendar, Photos" when starting from the top left and swiping to move in standard reading order. It takes a while to read over every icon displayed, that is for sure! But there is a positive consequence of this negative: one prioritises the mental processes needed to short-cut these delays, and the advantage of the X-Y layout is that there is ample potential for using memory skills to speed things up.

The cognitive process by which user A generates a mental model of the screen geography is akin to how a sighted person discovers a picture as they reveal it on a scratch card, or builds up an idea of a landscape by scanning across it with a telescope: they combine what they have glimpsed with the felt movement of the telescope until they feel they have seen enough, building up an image they never saw "in a oner".

User B, by contrast, sees the entire screen at once. In the time it takes user A to hear three icons spoken, user B has recognised the overall geography of the display. But this isn't necessarily an advantage! In the next moment they might be drawn to certain graphical designs over others, or they may be trying to ignore the little graphics (some of which are quite random, like the Messages icon, a green box with a cloud thing in it) and instead focus on reading the text associated with each icon. Unfortunately for them, they will see the graphics as well as the text, so their cognitive load will react to that whether they like it or not; whether anyone has measured the impact of icons as positives or negatives is not known to me as I write this. I'm describing a completely different negative here, and perhaps user B is more likely to prioritise mental strategies that filter out unnecessary visual detail, so as to speed up their interaction or reduce the cognitive load.

Now, I know both examples above are picked from many possible reactions to the home screen, but I hope they stand to reason and illustrate how the cognitive process for user B is loaded quite differently from that of user A.

To conclude this post: the more I have used the iPhone, the more I have gained a physical, geographic memory of where items are located, and the faster I am able to _go straight_ to the item I want at any given moment.

This has resulted in much faster eyes-free interaction with a touch-screen computer than operating a GUI-based computer by keyboard alone, because I no longer need to tab through a list of options strung out in time, suffering whatever decisions the designer made when setting the tab order.

My geographic memory abilities are therefore brought into play by the X-Y touch-screen approach.

I think a side effect of being able to use my geographic memory skills is a regular burst of satisfaction every time I hit, or get near, the item I was aiming for. This happens constantly, and could explain why blind people evangelise the iPhone so much once they get past the pain barrier of adapting to it.

As a final note, I know everyone has a pain barrier to get through, and I watch my 69-year-old mother-in-law, who has no sight difficulties, struggling to get the hang of an iPhone. But I think it is much harder to use an iPhone in the finger-constantly-stuck-to-the-screen way that blind people have to (the difficulty reduces as memory of the layout increases, though that is chicken and egg) than in the way sighted people use it, surveying with the eye and relying on eye-finger coordination. The latter is very likely an already highly practised skill for anyone who writes with a pen or presses little buttons, so it is another thing blind people won't have ready to deploy and will need to develop, which could explain how hard that pain barrier is.

Oh, and for anyone who read my post last year,
"I like my Nokia Smartphone with all its buttons but should I stay or should I go iPhone?"
I have stayed with my Nokia C5, because it is so much more effective as a mobile phone and an on-the-move device than the slippery slab of talking glass (which is really how I perceive my iPhone). The reason is that I can feel when my finger is on the call button on my Nokia, without having to listen to the device telling me that my finger is on the call button! In a noisy environment, and when I only have one hand free to use the device, this really matters.

I am, however, using my iPhone combined with an Apple keyboard as my new laptop set-up: iPhone on the right, so I can look around the screen with my finger, and keyboard in the middle for when I have to type.

I have also discovered lots of interesting special key functions on the keyboard, which I could not find documented anywhere, and which let me click icons and fields without having to touch the screen…
