People like me who listen to web content instead of looking at a mass of stuff on the visual screen know that, far from looking packed with info and images, the web is in fact either a very, very quiet place or a very, very noisy and disorganised place, rarely if ever the sweet spot in the middle.
And before any sighted person shouts "it's just like this for us" (staring at this text within a lovely backdrop of a gently glowing screen, with architectural frames and designs and obvious toolbars of controls to help them on their way to understanding what they are looking at): no, it's completely different.
I mean noisy because, unlike your eye, which can instantly glance around the page, take in shapes and regions, and read and re-read words or lines of text as you wish and as your brain needs in order to digest them, a screen reader is just plain old noisy and messy.
In fact, relying on a screen reader is like being read to by a jerky, loud-voiced neighbour who understands nothing about layout or what you want to focus on, and just splurges out whatever they are seeing at you. The only options to direct and control them are a few arrow keys, which instantly cause them to jump onto something else and read that at you, until you press another key to stop them or jump somewhere else. They totally ignore images, symbols and layout, as if they mean nothing.
So where exactly they are gazing on the screen and reading out from, or how it relates to anything else on the page, is mostly inside their head, never to reach yours. I’m always glad to walk away from this neighbour in my life, but sadly I don’t have any other real choice on my work computer or home desktop.
Thankfully this annoying auto-person is less uncontrollable on my Android phone, which has a touch screen: I can actually touch each sentence, moving my finger down and feeling the phone's tactile feedback tell me each time I "bump" onto a new sentence. It's so much closer to what I remember the process of reading with my eyes feeling like. I still have no real left-and-right tactile feel for the words across the page, but it's almost there when I use the Google Docs app. Quite glitchy, but hey, it's now quite close to my fingertip being able to touch and feel words while hearing them read to me by TalkBack at the same time.
Roll on getting rid of that annoying screen-reader neighbour.
I left behind my keypad-based phone (a Nokia C5) over two months ago and I'm happy to say I'm doing fine on a touch screen device with not a physical button in sight, except for on/off and volume up/down, which, by the way, also move the cursor focus forward and back a character at a time in edit boxes! Well, it depends on the edit box: it works on some and not on others, although I cannot remember which now. This isn't one of those fancy technical postings, by the way, where you can learn facts and figures; it's just my gut reactions.
More on my gut reactions:
The Android KitKat speaking voice is a design victory. I actually love it, not just like it or put up with it. It's not scratchy or digital-sounding like the iPhone voice, she sounds on just the right side of informative and enthusiastic, and it's rare that she lets me down. I think this is "the benchmark voice" for a handheld mobile device; if anyone else agrees, let me know!
The tactile feedback is also a victory for anyone who benefits from, or even just likes, sensory input on a physical level as well as an auditory one. I really miss it now on my Nexus 7 tablet and find myself scrubbing around much more tediously for buttons compared to the Moto E phone.
I like and need to use my mobile as a mobile, not just a standing-still phone, and the combo of the clear voice on the Android KitKat operating system with the tactile feedback has, in many situations, now met my other benchmark: using my mobile on the move.
Interesting observation: sighted people who stare at their phones as they walk along render themselves temporarily partially sighted and bump into other people. I'm happy to say this does not happen to me, despite being officially visually impaired! I have discovered that the spatial bit of my brain isn't disrupted by interacting with this touch screen phone (a spatial interface) while I'm navigating along, compared with my previous button-based Nokia C5 (a conventional tab-order interface). I think it's because operating key presses in a tab-order style interface uses a different part of the brain and therefore conflicts with walking along. This is something I wasn't expecting, but I do think the tactile feedback is the key factor, because I did not discover this effect with an iPhone, which has no tactile feedback.
I have activated the so-called 'experimental single tap' function, which I never really understood and still don't, but I assume it means that when I reach the button or link or item I want, I can just lift my finger off very slightly and tap down once at that precise point on the surface, and that activates it. If this is how it works then I commend it, because it's led to me interacting much more spatially with the X-Y layout and freed me from swiping to reach controls (with no idea where they are located on the screen), which is in effect just a tab-order way of using the phone. I now find my muscle memory remembering where controls are on the surface in lots of different applications, and my speed of interaction has increased hugely. In fact, on applications I use a lot, I am now moving perhaps nearly as deftly as a sighted person (for whom all this interface stuff is heavily designed and obviously optimised). I'm not getting too excited though: as soon as my interaction speed goes up on familiar apps, I find myself getting even more frustrated with less familiar apps.
_End and a global call to action!_
I'm surviving, but suffering far too many problems, some of which are ridiculously basic. Let's focus on one: the sheer number of unlabelled buttons in third-party apps is a scandal, and even Google's own creations are littered with them. The BBC MediaPlayer seems to have not a single button even present, let alone labelled; I really don't understand what's going on there at all. This is really poor and maybe proves no one is checking or testing their creations with TalkBack. If they are, why are they not noticing this massive problem?
Labelling buttons isn't about sticking to rules or boring ideas like that. This is about companies who are creating really interesting and often brilliant things, actually living the dream; but if the only dream they are living is their own, and they are sticking signs on the doors with no text, then in the long run that isn't going to unlock the doors so everyone can join the fun.
I'm going to conclude this posting with a simple global call to action: test what you create with TalkBack to make sure you've added decent text labels to all of the buttons and controls. It is easy to do. Here's how: activate TalkBack and explore by touch on your device (built in and free of charge), then sit back, watch the TV if you want, and listen as you explore your app. Identify and fix buttons which TalkBack announces as "button 51" or whatever number it is. That is an unlabelled button. If it's a Play button or a close-and-go-back button, label it as Play or Go back! For those buttons with a symbol or other marker that is supposed to make sense to everybody (but often doesn't), at least give them a text label that communicates the symbol on the button: if it's an X or a left arrow, add X or Left Arrow as the text label.
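For any Android developers reading this, the fix is usually a single attribute. Here is a minimal sketch (the button id, icon and string resource names are made up for illustration) using the standard `android:contentDescription` attribute, which is what gives an icon-only control a spoken TalkBack label:

```xml
<!-- Hypothetical layout fragment: an icon-only Play button.
     Without android:contentDescription, TalkBack has no text to speak
     and falls back to announcing something unhelpful like "button 51". -->
<ImageButton
    android:id="@+id/play_button"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/ic_play"
    android:contentDescription="@string/play_label" />
```

The same label can be set from code with `setContentDescription("Play")`, and a purely decorative image can be hidden from TalkBack entirely with `android:importantForAccessibility="no"` so it isn't announced at all.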
Time, please! You are a busy person, I know, but every unlabelled button affects other busy people too, and the upside is that every time you make a button accessible by labelling it, you are opening that door to someone who might be sitting next to you on the tube the very next day, hoping to use your creation, and you'll be directly saving them time and confusion and making their day a good one rather than a tedious one. So it's worth it! Do it! It's a bit like a suspended coffee: you'll never drink it, but you know how good coffee is when you do, and you are passing that joy on to a fellow citizen.
Thanks for the label!
OK, it's happened: my Nokia C5 has finally exited my pocket and the slightly larger, heavier Moto E phone has replaced it. My transition has been long and slow, and this has, I think, reflected the neurological adaptation that has had to take place in my brain. Simple as that.
Pros: with the Moto E held in an ergonomic position in my left hand and my right fingertip sliding around the slippy screen, with "experimental single tap" mode on, and with my frustration shield up (i.e. I don't expect TalkBack to work smoothly), this phone works: for ringing people, reading my SMS threads, sending texts, email and Gmail, messy and jittery web browsing, and experimenting with apps.
I can cope with a phone that won't answer calls roughly 20% of the time because the slide-right-to-answer gesture doesn't always work (the double-finger tap on my previous iPhone didn't either, by the way). I also don't mind having to physically stop walking on the street in order to do anything with the phone, whether reading, messaging, using orientation apps or searching for an address (the sorts of things a handheld device is most useful for when out and about). So it's a mobile device providing you aren't actually walking and trying to operate it at the same time. For a person who can accept all that, this phone hits the yes button!
Cons: unlike my Nokia C5, this type of touch screen phone cannot usefully let me answer calls, read and write texts, read web pages or instantly get and refresh TfL bus departure boards when I'm actually walking along with my white cane and navigating the pavement. Strange as it might sound, I got used to speeding out of my office and down to the bus stop, pulling up and checking live bus departure boards whilst striding along, within safety margins. This is one of the real downsides for me of the whole slippy touch screen user interface, and it applies to all devices of this type, not just Android.
I'm not going to write loads in this posting, but I can't finish without inevitably commenting on the Apple v Google comparison, which is very much a topic of conversation these days. On this I have to conclude, from three intensive weeks living with this Moto E as my only phone, that what Google have achieved so far for an inclusive society is this:
– Google have definitely opened up this Moto E, and I guess all other Android smartphones on the market (at a higher price), so I can at least get to play with it. But this isn't enough: I need to live with my smartphone, not just experiment and play with it. Google's attention to detail in delivering TalkBack as a smooth, efficient and positive user interaction is demonstrably lacking: it is error-prone and really poor in places, and generally feels like a prototype, not a finished product.
A final thought: although for some reason I like the idea of being an Android user, oddly, the effect of my experience with Android and TalkBack is that it's whetted my appetite for what a smartphone that enables a smoother and more successful mode of interaction could offer me. So I might upgrade to an Apple device sooner rather than later.
Last night, for some reason, the new version of BBC iPlayer suddenly appeared on the Apps screen of my Nexus 7. The device secretly updates itself all the time, so I wasn't that surprised.
Did I tap the icon with a sense of a world of wonderful BBC radio about to grace my ears? No. Up until now even the mention of these apps raised my blood pressure into the red, and I avoided tapping them for my own sanity, and I'm not joking.
Anyway, six months of non-BBC interaction via my Nexus 7 gave way to a momentary impulse to tap the icon and just see if anything had improved…
Aha! Can I believe it??? I've got to believe it! Two minutes later I was listening to Costing the Earth on listen again, and I actually felt like I had positively interacted with the controls. I assembled in my head a mental map of the entire visual layout, and got enough positive reinforcement from my finger movements hitting the objects I expected to hit, that finally a wonderful harmony rang out between my mental map, the device and the BBC iPlayer.
Harmony: it's the user experience that tells me the interface is working. Now I want more!
This so-called "ring that scans and reads text" might prove to be a lovely way for me and others to return to reading paper books, but I do wonder why the press release ignores the dawn of the text-to-speech enabled eBook? It does the same thing without your having to concentrate on guiding a little camera along a line of text and keeping it straight with buzzing fingertips!
For me, being able to read all those bits of paper given to me by public services, the NHS and commercial services would be far more valuable and liberating.
I had no idea just how much a pair of leather and metal headphones can radically enhance user experience!
I spend 8+ hours per day inside headphones, and I realised my ears needed a sofa, not a plastic chair! I can honestly say this pair of Bowers & Wilkins P5s is really working for me.
If my computing life can be likened to a lounge, it just feels easier to relax now nestled in my leather ear-sofa, I care less about the moody devices that I cannot control!
P.S. If anyone needs reminding: I listen to screen readers talking at me all day, so I don't get much time to listen to music. This posting is about comfort, long-term wearability and how this can modify one's experience of using devices eyes-free.
I've just spent 15 minutes trying to turn off the attachment preview in MS Outlook 2010.
My sighted office colleague and I struggled to make sense of the way the ribbon and menus worked. I have never come across such a seemingly crazy set-up before!
I found I needed to use every keyboard navigation trick I've ever learned (tree expanding and collapsing, descending list, horizontal list, tab panel, combo drop-down, and I'm sure there was even a combo pop-up explode-collapse-disappear thing in there too): a real mish-mash. I just could not form a mental map of what I was doing at all, and without this I was geographically lost. It's official!
It's made me think about "audible architecture", something I'm familiar with when I'm working with urban street design people. Does it exist in software interface design?
A web search reveals all sorts of software architects, and I guess someone did "design" the way the Outlook 2010 menus work, but it sure doesn't strike me that there is any audible design logic in there; or if there is, it doesn't make sense to me!
If the designers have invented, or ended up with, yet another audible contortion tangled around a visually optimised design approach, then I will happily direct them to the nearest purveyor of straitjackets and invite them to try living their life with one on, and see if they think that is inclusion…