
Opinion – Google Docs and Google Drive – almost the best thing since sliced bread and definitely not half baked

Just a quick opinion piece about using Google Docs and Google Drive on my Android, eyes-free of course. It’s so close to being genuinely amazing, but the bugs and glitches let it down far more than if it were the usual mediocre attempt at accessibility that is the norm. So close, but not close enough to land the prize of being the best document reading, creating and sharing solution since sliced bread… (erm, notice a slight idiom breakdown there…)

I’ll post some specifics on this soon.

Is NHS 1605 the third big game changer?

Without doubt, the new accessible information standard 1605, announced last Friday by NHS England, is THE biggest public sector thing to happen to accessible information in the last decade, and I think it will have a much bigger impact than the accessible information provisions of the Equality Act 2010. Sad to say, I don’t think the Equality Act 2010 really changed any games; maybe it set a new rule, but few followed it and even fewer have used the law to secure their rights. 1605 secures the right for you, and inspections will be carried out by the Care Quality Commission, saving blind people from having to take on cases themselves.

For the record, the two other big game changers are what Apple and Google have done by embedding accessibility into their smartphones, tablets and computers for free. Anyone who can use these devices has a drastically reduced need for transcription services or special alternative format provision, because the device can speak, or make a large print version, on the fly – importantly, of the mainstream message, email, document or web page that the company or service produces as standard for everyone else. It’s a win-win business strategy because it brings the blind person into the mainstream.

The new NHS standard effectively brings blind people into the mainstream too, and should be a win-win as well.

Hey, I’ve just been sent an “innovation alert” – fancy that?

Interestingly, I just got an “innovation alert” emailed to me from HP, the printer company, and as usual I don’t have time to watch (in my case, listen to) the video – it’s probably not accessibly text-described anyway – but I started to imagine what the alert might contain…

The video imagery starts with rapid-fire shots of conventional LaserJet printers spewing out laser-sharp printouts, but then the images slow down, focusing on lots of people’s hands, touching, shaping, building things, building lovely things, building amazing things…

The soundtrack is a low-toned, tensely confident, American-sounding woman saying: “At HP we are bored of trying to exceed the ever finer detail and ever richer colours that everyone else in the printing industry focuses on.

To be honest, printed imagery could be made even sharper and more colourful, but is this really what people want?

We are also pretty bored of two dimensional images and with fleecing people with the cost of expensive and polluting inks and toners, so we’ve decided to leap into a new world.

We’ve decided the thing that makes human beings so special is our amazing sense of touch, of physical interaction with the world, and the idea of turning the ideas in our heads into real, touchable objects. Yes, yes, that is just 3D printing, nothing new in that; yes, it’s cool, but no, it’s not enough for an innovation alert…

So what is enough for an innovation alert? What is big enough for an HP innovation alert? Well, here it is: our new range of 3D printers aren’t printers at all. They are beautiful wireless robots, the most dexterous sculptors you could ever meet. Download your design, touch Go on Robo-Angelo, then sit back and watch while the new generation of sculptors creates your design – in concrete, in wood, in plastic, in food, in anything. Whatever you can imagine, Robo-Angelo can shape up for you to touch…”

Ah, I’ve just spent longer writing this than if I’d watched the video. I wonder if the video actually says this…!

Six years – remembering Sam Puttick and his parents Neil and Kazumi

Today is the day for remembering Sam Puttick and his parents.

Six years ago last Friday, I was walking into the office where I’m sitting now when Sam’s dad, Neil, texted me to say: always remember Sam. Sam, his five-year-old son, died of meningitis shortly afterwards, on 30th May 2009.

Neil and Kazumi had kept Sam alive after a road accident which injured him so severely that he had to be ventilated and had no movement from the neck down. If I’d known how deep their struggles would be in the years that followed, I’d have moved in next door to help, but somehow we didn’t know, so we didn’t.

But in those short years they lived a whole life. They were survivors and livers of life, but in a humble and honest way. Neil fought for his son and wife throughout, telling me of his many struggles: social barriers, isolating behaviour from people they’d counted on as friends, unjust and unfair treatment from insurance companies, deep difficulties with family relationships – all on top of his single mission of keeping his son alive long enough for a medical breakthrough that would help him regain some movement.

All three of them were soldiers, survivors, serving the mission of life, pushing beyond human endurance against the constant bombardment of emotional and physical torments thrown at them. Alas, like the common foot soldier, they were felled by the risks of their mission, going down like the first violinist on the Titanic, playing on.

Of course they could have made it, just like a raindrop can make it when it falls onto a dry field and a thousand years later bubbles up in a spring, but for them, as it is for many raindrops, the journey ended when it did. Sam died on 30th May and soon after Neil and Kazumi threw in the towel. After six years of wondering, I think I can understand now why they did what they did.

So here’s remembering Sam Puttick and his courageous mum and dad, Kazumi, and Neil Puttick.

I’m typing this in my office and I refuse to cry in front of everyone. Neil wouldn’t have, so neither will I. I’m looking at the bright daylight and hearing the London traffic; life goes on, and that is why I’m remembering – so it goes on in a thoughtful way, keeping the spring bubbling.

And let’s celebrate the things we do to treat each other inclusively – when we recoil from a person who initially scares us, take a moment to turn back and re-engage. This is the behaviour that makes our society inclusive, and it is what makes life good.

125,000 people are about to enter the digital world – but will accessibility glitches mean they make a swift exit, or are developer attitudes really going to change?

News: a national charity has recently launched a major programme to help get 125,000 blind and partially sighted people online.

This is big: it amounts to a third of all blind and partially sighted people in the UK, and well over the 80,000 of working age. How many will successfully dump traditional methods of accessing information (large print, braille, audio) and become full digital citizens remains to be seen.

Could the bottom be about to drop out of the market for alternative formats? I’m not sure. I wonder how big that market is anyway, or whether the UK Association for Accessible Formats has a view on this.

What I am sure of is that this huge new initiative is unlikely to get local authorities thinking seriously about the need to digitally rehabilitate blind and partially sighted people, because being able to use a white cane is increasingly less important than being able to function digitally. The same goes for the app and website developer communities; I wonder if it’ll get them thinking and designing inclusively?

Let’s face it, who is putting big bucks into teaching and training app and web developers to design inclusively? I don’t see anyone. These millions of pounds to fund work to get so many people with sight impairments online could actually just be sending over a hundred thousand people into a digital world of partly or completely inaccessible apps and websites, driving them mad with frustration. On the bright side, driving up demand for inclusive design could actually increase the developer community’s awareness of accessibility. I hope it’s the latter.

What we do know from practical example is that the potential for accessibility has changed a whole lot in the last five years. Google and Apple have put the power to instantly enlarge screen text, or to read it aloud, into the hands of blind and partially sighted people everywhere – well, those with a modest monthly budget to spend – thanks to the ever-improving generations of smartphones and tablets that fill pockets and litter sofas worldwide.

But turning to the other side of the coin – the potential the app and web developer communities are generating – we can’t be so sure that potential is carried through. I think many, many people will get stuck somewhere on the steep climbs of the learning curve, hit overhanging apps and mobile sites which won’t enlarge or read out loud, then slide back down. I only hope traditional alternative formats will still be there for them at the bottom…

New paradigm needed for non-visual interaction with visual interfaces

OK, I need to get this off my chest. I’ve thought this for over 15 years but not done anything about it. Maybe someone else can…

Why is it that screen readers continue to be designed in ways that fail to exploit the full capacity of human touch and hearing?

For sighted people, it would be as if the days of the low-res, blocky green screen had continued just because no one thought to develop the display beyond that idea. For those of us who interact with our computers through sound and touch, the experience remains, with some slight exceptions, flatly monophonic: no sense of left and right, no up or down, no deep or shallow, no sonic effects or acoustically distinct spaces within our headphones or speakers. It’s like being on the phone.

The ridiculous thing is, the potential for high-definition acoustics, binaural (spatially localised) sound and bi-manual interaction on even the cheapest smartphone is right here, right now. For some reason, the dev community just isn’t creatively exploiting it.
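To make that concrete, here is a minimal sketch, in browser TypeScript, of the kind of thing already possible today. The HRTF panning in the Web Audio API’s PannerNode is standard; everything else here (the page layout, the earcon, the speakAt helper) is my own hypothetical illustration, not any shipping screen reader’s design.

```ts
// Sketch: place spoken UI regions in binaural space with the Web Audio API.
const ctx = new AudioContext();

// Speak a phrase "from" a position: left/right (x), up/down (y), near/far (z).
function speakAt(text: string, x: number, y: number, z: number): void {
  const panner = new PannerNode(ctx, {
    panningModel: "HRTF", // head-related transfer function: true binaural cues
    positionX: x,
    positionY: y,
    positionZ: z,
  });
  panner.connect(ctx.destination);

  // speechSynthesis output can't be routed into Web Audio directly, so as a
  // stand-in play a short positioned earcon, then speak the text normally.
  const osc = new OscillatorNode(ctx, { frequency: 880 });
  osc.connect(panner);
  osc.start();
  osc.stop(ctx.currentTime + 0.08);

  speechSynthesis.speak(new SpeechSynthesisUtterance(text));
}

// Imagined page layout: navigation to the left, article centre, tools right.
speakAt("Navigation", -1, 0, -1);
speakAt("Main article", 0, 0, -1);
speakAt("Toolbar", 1, 0.5, -1);
```

Even this crude earcon-plus-speech trick would give a listener the left, right, up and down that current screen readers throw away.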

Why web accessibility is so much more than coding for screen reader compatibility

People like me, who listen to web content instead of looking at a mass of stuff on the visual screen, know that far from looking packed with info and images, the web is in fact either a very, very quiet place or a very, very noisy and disorganised one – rarely, if ever, the sweet spot in the middle.

And before any sighted person shouts “it’s just like this for us” – staring at this text against the lovely backdrop of a gently glowing screen, with architectural frames and designs and obvious toolbars of controls to help them on their way to understanding what they are looking at – no, it’s completely different.

I mean noisy because, unlike your eye – which can instantly glance around the page, take in shapes and regions, and read and re-read words or lines of text as you wish and as your brain needs in order to digest them – a screen reader is just plain noisy and messy.

In fact, relying on a screen reader is like being read to by a jerky, loud-voiced neighbour who understands nothing about layout or what you want to focus on, and just splurges whatever they are seeing at you. The only options for directing and controlling them are a few arrow keys, which instantly make them jump onto something else and read that at you, until you press another key to stop them or jump somewhere else again. They seem to totally ignore images, symbols and layout, as if they mean nothing.

So where exactly they are gazing on the screen and reading out from, or how it relates to anything else on the page, is mostly inside their head, never to reach yours. I’m always glad to walk away from this neighbour in my life, but sadly I don’t have any other real choice on my work computer or home desktop.

Thankfully, this annoying auto-person is easier to control on my Android phone, which has a touch screen: I can actually touch each sentence, moving my finger down and feeling the phone’s tactile feedback tell me each time I “bump” onto a new sentence. It’s so much closer to what I remember the process of reading with my eye feeling like. I still have no real left-and-right tactile feel for the words across the page, but it is almost there when I use the Google Docs app – quite glitchy, but hey, it’s quite close now to my fingertip being able to touch and feel the words while TalkBack reads them to me at the same time.
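For anyone curious what that “bump per sentence” feel could look like outside TalkBack, here is a rough web prototype sketch in TypeScript. It assumes a page that wraps each sentence in a span (my own hypothetical markup, not a real convention); navigator.vibrate and speechSynthesis are genuine browser APIs, though vibration isn’t supported everywhere (notably not in iOS Safari).

```ts
// Sketch of the "bump per sentence" feel as a web prototype.
// Assumes the page wraps each sentence in <span class="sentence">…</span>.
let lastSentence: Element | null = null;

document.addEventListener("touchmove", (e: TouchEvent) => {
  const touch = e.touches[0];
  const el = document.elementFromPoint(touch.clientX, touch.clientY);
  const sentence = el?.closest(".sentence") ?? null;

  // Fire once per sentence boundary: a short tactile "bump", then speech.
  if (sentence && sentence !== lastSentence) {
    lastSentence = sentence;
    navigator.vibrate?.(15); // 15 ms pulse: the "bump" onto a new sentence
    speechSynthesis.cancel(); // stop reading the previous sentence mid-flow
    speechSynthesis.speak(
      new SpeechSynthesisUtterance(sentence.textContent ?? "")
    );
  }
});
```

A couple of dozen lines, on APIs that already ship in every mainstream browser, and you have finger-under-the-words reading; which rather makes the point about unexploited potential.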

Roll on getting rid of that annoying screen-reader neighbour.