Hey, I’ve just been sent an “innovation alert” – fancy that?

Interestingly, I’ve just had an “innovation alert” emailed to me from HP, the printer company, and as usual I don’t have time to watch (in my case, listen to) the video – it’s probably not accessibly described in text anyway – but I started to imagine what the alert might contain…

The video imagery starts with rapid-fire shots of conventional LaserJet printers spewing out laser-sharp printouts, but then the images slow down, focusing on lots of people’s hands, touching, shaping, building things, building lovely things, building amazing things…

The soundtrack is a low-toned and tensely confident American-sounding woman saying: “At HP we are bored of trying to exceed the ever finer and finer detail and ever richer and richer colours that everyone else in the printing industry focuses on.

To be honest, printed imagery could be made even sharper and more colourful, but is this really what people want?

We are also pretty bored of two-dimensional images, and of fleecing people with the cost of expensive and polluting inks and toners, so we’ve decided to leap into a new world.

We’ve decided the thing that makes human beings so special is our amazing sense of touch, of physical interaction with the world, and the idea of turning the ideas in our heads into real, touchable objects. Yes, yes, that is just 3D printing, nothing new in that; yes it’s cool, but no, it’s not enough for an innovation alert…

So what is enough for an innovation alert? What is big enough for an HP innovation alert? Well, here it is: our new range of 3D printers aren’t printers at all. They are beautiful wireless robots, the most dexterous sculptors you could ever meet. Download your design, touch Go on Robo-Angelo, then sit back and watch while the new generation of sculptors creates your design – in concrete, in wood, in plastic, in food, in anything. Whatever you can imagine, Robo-Angelo can shape up for you to touch…”

Ah, I’ve just spent longer writing this than if I’d watched the video. I wonder if the video does actually say this…!

Six years – remembering Sam Puttick and his parents Neil and Kazumi

Today is the day for remembering Sam Puttick and his parents.

Six years ago last Friday, I was walking into the office where I’m sitting now when Sam’s dad Neil texted me to say: always remember Sam. Sam, his five-year-old son, died shortly afterwards, on 30th May 2009, of meningitis.

Neil and Kazumi had kept Sam alive after a road accident which injured him so severely that he had to be ventilated and had no movement from the neck down. If I’d known how deep their struggles would be in the years that followed, I’d have moved in next door to help, but somehow I didn’t know, so I didn’t.

But in those short years they lived a whole life. They were survivors and livers of life, but in a humble and honest way. Neil fought for his son and wife throughout, telling me of his many struggles: with social barriers, with isolating behaviour from people they’d counted on as friends, with unjust and unfair treatment from insurance companies, and with deep difficulties in family relationships – all on top of his single mission to keep his son alive long enough for a medical breakthrough that would help Sam regain some movement.

All three of them were soldiers, survivors, serving the mission of life, pushing beyond human endurance under the constant bombardment of emotional and physical torments thrown at them. Alas, like the common foot soldier, they were felled by the risks of their mission, going down like the first violinist on the Titanic, playing on.

Of course they could have made it, just like a raindrop can make it when it falls onto a dry field and a thousand years later bubbles up in a spring, but for them, as it is for many raindrops, the journey ended when it did. Sam died on 30th May and soon after Neil and Kazumi threw in the towel. After six years of wondering, I think I can understand now why they did what they did.

So here’s remembering Sam Puttick and his courageous mum and dad, Kazumi, and Neil Puttick.

I’m typing this in my office and I refuse to cry in front of everyone. Neil wouldn’t have, so neither will I. I’m looking at the bright daylight and hearing the London traffic, life goes on, and that is why I’m remembering, so it goes on in a thoughtful way, keeping the spring bubbling.

And let’s celebrate the things we do to treat each other inclusively – when we recoil from a person who initially scares us, take a moment to turn back and re-engage – this is the behaviour that makes our society inclusive, and it is what makes life good.

125,000 people are about to enter the digital world – but will accessibility glitches mean they make a swift exit, or are developer attitudes really going to change?

News: a national charity has recently launched a major programme to help get 125,000 blind and partially sighted people online.

This is big: it amounts to a third of all blind and partially sighted people in the UK, and well over the 80,000 of working age. How many will successfully dump traditional methods of accessing information (large print, braille, audio) and become full digital citizens remains to be seen.

Could the bottom be about to drop out of the market for alternative formats? I’m not sure. I wonder how big that market is anyway, or whether the UK Association for Accessible Formats has a view on this.

What I am sure of is that this huge new initiative is unlikely to get local authorities thinking seriously about the need to digitally rehabilitate blind and partially sighted people, because being able to use a white cane is increasingly less important than being able to function digitally. The same goes for the app and website developer communities: I wonder if it’ll get them thinking and designing inclusively?

Let’s face it, who is putting big bucks into teaching and training app and web developers to design inclusively? I don’t see anyone. These millions of pounds to fund work getting so many people with sight impairments online could actually just be sending over a hundred thousand people into a digital world of partly or completely inaccessible apps and websites, driving them mad with frustration. On the bright side, driving up demand for inclusive design could actually increase the developer community’s awareness of accessibility. I hope it’s the latter.

What we do know from practical experience is that the potential for accessibility has changed a whole lot in the last five years. Google and Apple have put the power to instantly enlarge screen text or to read it aloud into the hands of blind and partially sighted people everywhere – well, those with a modest monthly budget to spend – thanks to the ever improving generations of smartphones and tablets that fill pockets and litter sofas worldwide.

But turning to the other side of the coin – the potential the app and web developer communities are generating – we can’t be so sure that potential is carried through. I think many, many people will get stuck somewhere on the steep climbs of the learning curve, hit overhanging apps and mobile sites which won’t enlarge or read out loud, then slide back down. I only hope traditional alternative formats will still be there for them at the bottom…

New paradigm needed for non-visual interaction with visual interfaces

OK, I need to get this off my chest. I’ve thought this for over 15 years but not done anything about it. Maybe someone else can…

Why is it that screen readers continue to be designed so that they fail to exploit the full capacity of human touch and hearing?

For sighted people, it’s as if the days of the low-res, blocky green screen had continued just because no one thought to develop the display beyond that idea. For those of us who interact with our computers through sound and touch, the experience remains, with some slight exceptions, flatly monophonic: no sense of left and right, no up or down, no deep or shallow, no sonic effects or acoustically different spaces within our headphones or speakers. It’s like being on the phone.

The ridiculous thing is, the potential for high-definition acoustics, binaural (spatial) sound localisation and bi-manual interaction is right here, right now, on even the cheapest smartphone. For some reason, the dev community just isn’t creatively utilising it.
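To show how little it would take, here’s a minimal sketch of the sort of thing I mean – a made-up Kotlin/Android example, not anyone’s real screen reader code – which simply pans a short UI sound (an “earcon”) left or right to match where a control sits on the screen:

```kotlin
import android.media.AudioAttributes
import android.media.SoundPool

// Illustrative sketch only: pan an earcon left/right so its stereo
// position mirrors a control's horizontal position on screen.
// earconId is assumed to be a sample preloaded via soundPool.load().

fun buildAccessibilitySoundPool(): SoundPool =
    SoundPool.Builder()
        .setMaxStreams(4)
        .setAudioAttributes(
            AudioAttributes.Builder()
                .setUsage(AudioAttributes.USAGE_ASSISTANCE_ACCESSIBILITY)
                .build()
        )
        .build()

fun playSpatialEarcon(soundPool: SoundPool, earconId: Int, x: Float, screenWidth: Float) {
    // Map horizontal position 0..screenWidth to a 0..1 pan factor.
    val pan = (x / screenWidth).coerceIn(0f, 1f)
    // Crude stereo panning: quieter on the left, louder on the right,
    // as the control sits further across the screen.
    soundPool.play(earconId, 1f - pan, pan, /* priority = */ 1,
        /* loop = */ 0, /* rate = */ 1f)
}
```

True binaural localisation would need proper head-related filtering rather than this crude left/right volume split, but even a pan this simple would give every control a place in the stereo field instead of everything arriving dead centre, phone-call style.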

Why web accessibility is so much more than coding for screen reader compatibility

People like me, who listen to web content instead of looking at a mass of stuff on the visual screen, know that far from looking packed with info and images, the web is in fact a very, very quiet place, or a very, very noisy and disorganised place – rarely, if ever, the sweet spot in the middle.

And before any sighted person shouts “it’s just like this for us” – staring at this text within a lovely backdrop of a gently glowing screen, with architectural frames and designs and obvious toolbars of controls to help them on their way to understanding what they are looking at – no, it’s completely different.

I mean noisy because, unlike your eye – which can instantly glance around the page, take in shapes and regions, and read and re-read words or lines of text as you wish and as your brain needs in order to digest it – a screen reader is just plain old noisy and messy.

In fact, relying on a screen reader is like being read to by a jerky, loud-voiced neighbour who doesn’t understand anything about layout or what you want to focus on, and just splurges out whatever they are seeing at you. The only options to direct and control them are a few arrow keys, which instantly cause them to jump onto something else and read that at you, until you press another key to stop them or jump somewhere else. They seem to totally ignore images, symbols and layout, as if they mean nothing.

So where exactly they are gazing on the screen and reading out from, or how it relates to anything else on the page, is mostly inside their head, never to reach yours. I’m always glad to walk away from this neighbour in my life, but sadly I don’t have any other real choice on my work computer or home desktop.

Thankfully this annoying auto-person is less uncontrollable on my Android phone, which is a touch screen: I can actually touch each sentence, moving my finger down and feeling the phone’s tactile feedback tell me each time I “bump” onto a new sentence. It’s so much closer to what I remember the process of reading with my eye feeling like. I still have no real left and right tactile feel for the words across the page, but it’s almost there when I use the Google Docs app – quite glitchy, but hey, it’s quite close now to my fingertip being able to touch and feel words while hearing them read to me by TalkBack at the same time.

Roll on getting rid of that annoying screen-reader neighbour.

Global call to label buttons, plus an update on using a bottom-end Android phone with TalkBack

_Beginning_
I left my keypad-based phone (a Nokia C5) behind over two months ago, and I’m happy to say I’m doing fine on a touch screen device with not a physical button in sight, except for the on/off and volume up and down buttons – which, by the way, also move the cursor focus forward and back a character at a time in edit boxes! Well, that depends on the edit box: it works in some and not in others, although I can’t remember which now. This isn’t one of those fancy technical postings, by the way, where you can learn facts and figures; it’s just my gut reactions.

_Middle_
More on my gut reactions:

The Android KitKat speaking voice is a design victory – I actually love it, not just like it or put up with it. It’s not scratchy or digital-sounding like the iPhone voice, she sounds just on the right side of informative and enthusiastic, and it’s rare that she lets me down. I think this is “the benchmark voice” for a handheld mobile device – if anyone else agrees, let me know!

The tactile feedback is also a victory for anyone who benefits from, or even just likes, sensory input on a physical level as well as an auditory one. I really miss it now on my Nexus 7 tablet, and find myself scrubbing around much more tediously for buttons compared to the Moto E phone.

I like and need to use my mobile as a mobile – not just a standing-still phone – and the combination of the clear voice on Android KitKat with the tactile feedback has, in many situations, now met my other benchmark: using my mobile on the move.

Interesting observation: sighted people who stare at their phones as they walk along render themselves temporarily partially sighted and bump into other people. I’m happy to say this does not happen to me, despite my being officially visually impaired! I have discovered that the spatial bit of my brain isn’t disrupted by interacting with this touch screen phone (a spatial interface) as I’m navigating along, compared with my previous button-based Nokia C5 (a conventional tab-order interface). I think it’s because operating key presses in a tab-order style interface uses a different part of the brain and therefore conflicts with walking along. This is something I wasn’t expecting, but I do think the tactile feedback is the key factor, because I did not discover this effect with an iPhone, which has no tactile feedback.

I have activated the so-called ‘experimental single tap’ function. I never really understood what it did and still don’t, but I assume it means that when I reach the button, link or item I want, I can lift my finger off very slightly and tap down just once at that precise point on the surface to activate it. If this is how it works then I commend it, because it’s led to me interacting much more spatially with the X–Y layout and freed me from swiping to reach controls (with no idea where they are located on the screen), which is in effect just a tab-order way of using the phone. I now find my muscle memory remembering where controls are on the surface in lots of different applications, and my speed of interaction has increased hugely. In fact, in applications I use a lot, I am now moving perhaps nearly as deftly as a sighted person (for whom all this interface stuff is heavily designed and obviously optimised). I’m not getting too excited though: as soon as my interaction speed goes up on familiar apps, I find myself getting even more frustrated with less familiar apps.

_End and a global call to action!_
I’m surviving, but suffering far too many problems, some of which are ridiculously basic. Let’s focus on one: the sheer number of unlabelled buttons in third-party apps is a scandal, and even Google’s own creations are littered with them. The BBC MediaPlayer seems to have not a single button even present, let alone labelled – I really don’t understand what’s going on there at all. This is really poor, and maybe proves no one is checking or testing their creations with TalkBack – and if they are, why are they not noticing this massive problem?

Labelling buttons isn’t about sticking to rules or boring ideas like that. This is about companies who are creating really interesting and often brilliant things, actually living the dream – but if the only dream they are living is their own, and they are sticking signs on the doors with no text, then in the long run that’s not going to unlock the doors so everyone can join the fun.

I’m going to conclude this posting with a simple global call to action: test what you create with TalkBack to make sure you’ve added decent text labels to all of the buttons and controls. It is easy to do. Here’s how: activate TalkBack and explore-by-touch on your device (built in and free of charge), then sit back, watch the TV if you want, and listen as you explore your app. Identify and fix buttons which TalkBack announces as “button 51” or whatever number it is – that is an unlabelled button. If it’s a Play button or a close-and-go-back button, label it as Play or Go back! For buttons with a symbol or other marker that is supposed to make sense to everybody (but often doesn’t), at least give it a text label that communicates the symbol on the button: if it’s an X or a left arrow, add X or Left Arrow as the text label.
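For Android developers wondering where such a label actually lives, it’s usually a one-liner. Here’s a minimal sketch – the view names are invented, but contentDescription is the real attribute TalkBack reads aloud:

```kotlin
import android.view.View
import android.widget.ImageButton

// Illustrative sketch: give icon-only buttons spoken labels so TalkBack
// announces "Play" or "Go back" instead of "button 51". The view names
// here are hypothetical, not from any particular app.

fun labelControls(playButton: ImageButton, backButton: ImageButton, divider: View) {
    playButton.contentDescription = "Play"
    backButton.contentDescription = "Go back"
    // Purely decorative views are better hidden from TalkBack entirely,
    // so they don't clutter the reading order.
    divider.importantForAccessibility = View.IMPORTANT_FOR_ACCESSIBILITY_NO
}
```

The same labels can be set statically in layout XML with android:contentDescription, which is usually the easiest place to put them.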

Time, please! You are a busy person, I know, but every unlabelled button affects other busy people too, and the upside is that every time you make a button accessible by labelling it, you are opening that door to someone who might be sitting next to you on the tube the very next day, hoping to use your creation, and you’ll be directly saving them time and confusion and making their day a good one rather than a tedious one. So it’s worth it! Do it! It’s a bit like a suspended coffee: you’ll never drink it, but you know how good coffee is when you do, and you are passing that joy on to a fellow citizen.

Thanks for the label!

Do Androids dream of accessible phones?

OK, it’s happened: my Nokia C5 has finally exited my pocket and the slightly larger, heavier Moto E has replaced it. My transition has been long and slow, and I think this reflects the neurological adaptation that has had to take place in my brain. Simple as that.

Pros: with the Moto E held in an ergonomic position in my left hand and my right fingertip sliding around the slippy screen, with “experimental single tap” mode on, and with my frustration shield up (i.e. I don’t expect TalkBack to work smoothly), this phone works – for ringing people, reading my SMS threads, sending texts, email and Gmail, messy and jittery web browsing, and experimenting with apps.

For a person who can cope with a phone that won’t answer calls roughly 20% of the time, because the slide-right-to-answer gesture doesn’t always work (the double-finger tap on my previous iPhone didn’t either, by the way), and who doesn’t mind having to physically stop walking in the street in order to do anything with the phone – whether reading, messaging, using orientation apps or searching for an address, the sorts of things a handheld device is most useful for when out and about, making this a mobile device only so long as you aren’t actually walking while trying to operate it – then this phone hits the yes button!

Cons: unlike my Nokia C5, this type of touch screen phone cannot usefully support answering calls, reading and writing texts, reading web pages, or instantly getting and refreshing TfL bus departure boards when I’m actually walking along with my white cane and navigating the pavement. Strange as it might sound, I got used to speeding out of my office and down to the bus stop, pulling up and checking live bus departure boards whilst striding along, within safety margins. This is one of the real downsides for me with the whole slippy touch screen user interface, and it applies to all devices of this type, not just Android.

I’m not going to write loads in this posting, but I can’t finish without inevitably commenting on the Apple v Google comparison, which is very much a topic of conversation these days. On this I have to conclude, from three intensive weeks living with this Moto E as my only phone, that what Google have achieved so far for an inclusive society is this:
– Google have definitely opened up this Moto E, and I guess all other Android smartphones on the market (at a higher price), so that I can at least get to play with it. But this isn’t enough: I need to live with my smartphone, not just experiment and play with it. Google’s attention to detail in delivering TalkBack as a smooth, efficient and positive user interaction is demonstrably error-prone and really poor in places, and it generally feels like a prototype, not a finished product.

A final thought: although for some reason I like the idea of being an Android user, oddly, the effect of my experience with Android and TalkBack is that it’s whetted my appetite for what a smartphone that does enable a smoother and more successful mode of interaction could offer me, so I might upgrade to an Apple device sooner rather than later.
