
Customers – take back the definition of ‘accessibility’; don’t let companies define it to suit their needs over yours

I just searched online to query the accessibility of Microsoft’s new Edge browser, because it fired up on my new Windows 10 laptop and I can’t access or read anything the Edge browser shows.

I instantly found various search results suggesting that either the mighty Microsoft simply haven’t been able to make Edge work for customers who use screen readers, or they have rolled out their product without having completed its accessibility features.

(The search results I’m commenting on are pasted at the bottom of this post)

A search result from this year (2017 at the time of writing) calls for accessibility in Microsoft Edge, next to an adjacent search result from almost two years ago, in 2015, where Microsoft implied they would deliver accessibility in the Edge browser. Most significantly to me, Microsoft promote their view that accessibility development is a “journey”, not a “destination”.

If, like me, you are one of Microsoft’s customers who comes with an impairment, is this an inspiring philosophy or a flop?

It’s a flop. I think it is time for the key shapers of our future livelihoods and employability to stop stretching and bending the meaning of the word “accessibility” into what they’ve managed to turn it into (a never-ending learning process) and start delivering products that work for the average customer – and that includes the average blind customer and the average hearing-impaired customer, of whom there are millions and millions worldwide.

This isn’t an unreasonable ask. Microsoft is a very big company and delivers products into very big markets and that means very large numbers of customers with impairments and disabilities of one sort or another are part of their target markets.

The definition of a working product is easy too – bottom line, it’s one that doesn’t get complained about.

Sell it straight

If a product hasn’t been tested, or is known not to work as advertised for certain types of customer, that should be stated on the label and in the product description. Vendors should not try to hide things or mislead customers; just say it as it is. If it doesn’t work for blind customers, put that on the label.

All customers must be able to make informed choices when buying and entering into contracts with companies and that starts with clear and accurate product descriptions. Promising accessibility but then not delivering on that dimension of a product is just not a fair or useful way of dealing with this challenge, and certainly not one that a mega-corp can or should ever think is acceptable.

Defensive narratives

Walking on eggshells seems to be what advocates and accessibility consultants often do when it comes to challenging mega-corps on their delivery of proper, usable products, and I think this may be a result of defensive narratives. I think the whole “journey, not destination” narrative comes out of a simple under-funding and under-resourcing of the part of product development that relates to usability for customers with impairments and disabilities.

I also fear many companies, even the largest in the world, are almost certainly avoiding the issue, and perhaps even creating defensive narratives to hide behind, when it comes to the way their products perform for known user groups, such as blind customers.

Whoever controls accessibility controls the outcome

The bigger issue here, though, I think, is how influential software manufacturers seem to have taken control of what accessibility is. They have re-defined and translated it into something it isn’t. To me, they’ve turned it from a simple story told from the customer’s perspective – “can I use this product, and does it work as advertised?” – into something more like a film script of passion, endeavour and aspiration, a talisman of their desire to overcome adversity, a never-ending story. The only problem is that, just like a film, it is all about parody and desire, and little about real, ordinary people’s lives.

Time to take back the definition of accessibility

Customers should not allow companies to define and control what accessibility is for them; they must take it back, because it is theirs.

A successful company will be one that is bold enough, open enough and strong enough to push the definition of accessibility back to its customers. It will not talk about development journeys that have no destinations; it will say very little about it at all. Instead it will deliver products, count the complaints, and then deliver products that don’t get complaints. That’s it.

Treat all of your customers the same, they all deserve equal treatment.

Here are the search results that led me to question what is going on with accessibility here at the end of 2017:

24 Jan 2017 – 2 posts

Make Microsoft Edge more screen reader accessible. Edge not accessible with JAWS 17. … JAWS 17 does not read everything in either Windows Mail or, definitely even less, Microsoft Edge.

Accessibility: Towards a more inclusive web with Microsoft Edge and ……/accessibility-towards-a-more-inclusive-web-…

(Info: JAWS is an internationally popular screen reader application for Windows computers, used by blind people; among other things, it speaks out the text and controls displayed on the screen.)

25 Sep 2015 – Microsoft is committed to accessibility as a core part of software design, and today we would like to share more about how Microsoft Edge is evolving to improve support for assistive technology beyond what was possible in Internet Explorer. Inclusive development is a journey, not merely a destination, and …






To ‘go digital’ won’t health and social care need digital to go public first?

‘digital’ probably can make a substantial contribution to our society, at this moment I’m thinking of the contribution it can make to solving some of the problems that are currently pouring into the health and social care system…

I’ve been thinking about this wildly over-used phrase “going digital” more and more recently and all the things that’ll have to happen to make it work for the average person.

At the moment ‘going digital’ actually means going self-service through your own personally paid-for device, and learning how to use it and all the online and app-based services on it.

But for the sake of the average person I think we all must take a step back and think and plan a lot more around how people get to benefit from digital infrastructure in the first place. There is so much to learn about how the public can generally benefit from the new digital infrastructure by thinking about the last 50 years and how the public ended up benefitting from the national road infrastructure.

Digital is today’s big new thing, as the road network was 50 years ago. Like roads, what digital can do for you depends on the kit you can afford to use the network and the knowledge you have of the layout. Yes, people can use the roads to get about by walking or on a low-cost bicycle, but the range of things individuals can achieve using the road network on no-cost or low-cost kit is radically different from what can be achieved using motorised kit. Added to that, the more resource an individual can invest in their motorised kit (fuel, navigation, in-car comms), the more the possibilities available to them expand.

Like motor cars, smartphones and computers are privately financed and you have to learn how to use them yourself. You pay the connection and maintenance costs, then you can benefit from ‘going digital’. But so many people, as with the motor car, can’t afford them, or can’t afford much, can’t utilise them to the fullest extent, and end up moving around only small parts of the digital network. People who are sick or have temporary or long-term impairments or disabilities face extra layers of barriers preventing full and free access to the devices, and consequently to the range of services and information on the digital networks. Unlike the road network, which as a surface is pretty standard across the board, what’s on ‘digital’ is very, very diverse.

I think what hasn’t been solved or addressed with digital, that had to be addressed in the road networks is this:

– The road system stopped being the natural answer to society’s problems as more and more people used it.

– Inefficiencies crowded in, the costs of addressing them rose rapidly along with inequalities, and alternatives began to look more attractive as the century drew to a close.

– A public way to get around without having to buy, use and maintain a personal motor vehicle was the next stage (public transport and, in some ways, the internet and remote working!)

So far I don’t think the idea of public transport really exists in any coherent way in the digital realm. Everything is still very much aligned with private ownership of the digital equivalent of the motor car which is the smartphone or personal computer and the necessity to learn to use them.

I think that for health and social care services to work in the digital realm, there has to be an equivalent of public transport for everyone who can’t benefit from a privately owned device, or who is too sick or impaired to fully benefit from devices that are essentially “self-service” – something that becomes harder when you cannot do things for yourself owing to a health or social care problem.

What will ‘public digital’ be? Where is it evolving? What are the signs that something is happening to help the sections of our society that are on the other side of the ‘digital divide’?

have you spotted any strange clouds yet today? Facebook’s servers can now decode and describe images

BBC article on Facebook’s image reader technology
Even if the video demo shows only the most successful examples of automated image descriptions, it is still an amazing thing to hear machines programmed to describe pictures in words actually working!

Inevitably there will be many images for which the machines produce one description while a real human would find some completely different significance in them, and so describe them completely differently. But this is the start of a new era.

Screen readers that read text out loud are now what we could call first-generation assistive technologies: straight-talking text converters. But these image readers that Facebook are using now – these are second generation, and a whole different ball game.

[image showing a person smiling and staring quizzically into a golden sunlit horizon with strange shaped clouds scudding across the sky]

Ah, if you are sighted you’ll wonder where the image is that the above text description describes. There isn’t one; it’s in my head! You’ll just have to imagine it, just as I’ve imagined all the images that my screen reader has announced simply as “graphic” thus far!

Oh, and yes the image is conveying what I think of this new technology…

will new technology change the definition of “a cure for blindness” in 2016?

This is an update to a previous post, where I grandly proclaimed that medical science is more likely to fix digital accessibility than implementation of the Web Content Accessibility Guidelines.

I still believe this is true but the difference is there is a new dimension explained below. My original idea was based on the problem of biology being less complex to solve than the socio-economic problems from where most digital accessibility barriers are created in the first place, that and the fact that pharma companies can make money out of medical treatments that repair lost eyesight. All meaning the bio-solution has a clearer business case for universally solving digital accessibility barriers than ‘what you should think and do’ guidelines that attempt to change the way people behave in their day-to-day communications practices.

For now the status quo has remained, because markets arise out of other markets that aren’t working properly. So, as one example, the JAWS screen reader is a third-party product that rose up out of Microsoft’s inability to build an accessible Windows platform and applications eco-system, so a market for a JAWS-type product came into play. JAWS is a well-developed product, but let’s face it, if you had the choice of paying £10,000 to fix your eyes or £1,000 to buy JAWS, you would pay to fix your eyes, even if you weren’t allowed to use them for anything other than interacting with your computer. I say this because seeing the screen and being able to use the mouse pointer normally is more than ten times better than having to manage with JAWS, which does not provide an equal level of access to the Windows environment. For now, there is no ‘fix your eyes’ product for most blind people out there, yet.

So while disabled people remain disabled by digital barriers, and the problems persist unsolved, there does remain a strong business case for the whole accessibility industry: from advocates, consultants, testers and validators to trainers, product designers and developers, etc.

The reason why I am posting today is I need to add another dimension to the above idea, and it does change the calculations I’m trying to make above by changing the paradigm. Access technologies like screen readers are just one type of access technology, bound up and ultimately restricted by the very platforms they are trying to make accessible. There are other kinds of access technologies already developed, or in-development, that circumvent or remove the need to go onto websites and use screen readers in the first place.

You might remember a while ago I suggested that my smartphone is almost becoming a talking magnifying glass in my hand, wave it over visual information, or in fact just wave it in the air, and it converts what I’m trying to see (whether this is printed text or looking around my physical surroundings) and it supplies auditory information that I can interact with. I say almost becoming, because the problem is these amazing developments are really glitchy, horribly fiddly to use in the real-world and really don’t match the instant information grab that a seeing person can get with their eyes. Like a sighted person, I do not have enough time in the day to properly manage all the pressures of work, family, health and fitness, finances, social life etc, and I definitely don’t have time spare to take 10 times longer to read a note or check my surroundings, because the technology requires me to fight with a touch screen interface and strain to hear the voice feedback and interpret it.

Anyway, one point I am trying to make here is that a new form of “seeing” is possibly coming about: instead of using one’s eye and retina, the same effect as seeing comes about using equivalent information of the sort easily accessed by a smartphone, specifically geo-location and camera image processing matched against known things.

How that is then presented to you in a way you can control is the key thing. Smartphones and operating apps are often not the most appropriate way, because that interactive mode does not match the state one is in for the activity. Hence a Google Glass type device does fit the bill for these types of activities. Yes, it hasn’t found its place yet, but I am more and more touching the ordinary glasses I wear in a kind of hopeful way, preparing to mutter to them: “is there a bus stop near me and which way should I turn?”, “give me a bleep when the barman is looking in my direction”, and “I need to get across this busy shared space; please give me walking directions to the best pedestrian crossing and bleep when I’m looking directly at it, using the image feed combined with the GPS data”.

At present, none of the apps I have on my phone can directly supply these common daily needs, but a Google Glass type device just might.
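The “bleep when I’m looking directly at it” idea is, at its geometric core, surprisingly simple, even if the hard part is the real-world glitchiness described above. A minimal Python sketch of the underlying maths – the function names, tolerance and coordinates are my own illustrative assumptions, not any real app’s API:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing (degrees, 0 = north) from point 1 to point 2."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    x = math.sin(dl) * math.cos(p2)
    y = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return (math.degrees(math.atan2(x, y)) + 360) % 360

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes (haversine)."""
    R = 6371000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def looking_at(heading_deg, target_bearing_deg, tolerance_deg=10):
    """True when the wearer's compass heading points at the target,
    within a tolerance, handling wrap-around at 0/360 degrees."""
    diff = abs((heading_deg - target_bearing_deg + 180) % 360 - 180)
    return diff <= tolerance_deg
```

A device would compare `looking_at(compass_heading, bearing_deg(my_lat, my_lon, crossing_lat, crossing_lon))` a few times a second and bleep on a match; everything difficult (accurate heading, identifying the crossing in the first place) sits outside this sketch.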

So, a new question opens up, will a Google Glass type product come about sooner than the medics can invent ways to repair wrecked retinas…

Will 2016 prove a significant year for answering this question I wonder?

recovering from a ruptured Achilles tendon – my treatment and my story part 1 + amateur athlete + blind + got young kids

So Part 1 of my story starts at week 10, after rupturing my Achilles tendon back in August. My life has been so turned upside down, that I have only just realised I should blog this.

I bust my Achilles tendon on a running track in August, overloading it while doing a series of standing-start hurdles and sprint starts. I won’t go into what it felt like when it blew in this post, but it’s still very fresh in my mind.

Diagnosed in A&E later that night (yep, 5 hours of waiting), which is hard enough on your own, but my partner and two young children were in tow and we were miles from the place we were staying.

Plaster cast applied to the entire lower leg that night, with the foot in a strong downward point, toes poking out. Next day I saw the orthopaedic surgeon, who advised a conservative treatment programme. This, I think, should also be called the low-cost option for the NHS compared to surgery. Back then I was cross and felt perhaps cost considerations were outweighing my long-term prognosis, whatever the surgeon said the evidence suggested (similar success rates for surgery versus conservative treatment), but now, as you’ll read below, I don’t feel like this.

So skipping forward over the last two months of, well, is there a word for it? I’m now at 10 weeks and walking again reasonably ok, but still in a brace boot.

The boot comes after the six weeks or so in a plaster cast, and keeps your ankle at a roughly 90-degree position. When I came out of the plaster cast at 6.5 weeks, the plaster technician felt the active range of motion (AROM) of my ankle was good enough for me to progress into the brace boot with a 2cm heel block rather than the usual 4cm block.

The 1st week of attempting to walk in the boot was very weird. The whole leg from hip down needed to re-learn how to walk. I could shuffle so it wasn’t like starting from nothing, but it was a shock just how much capability had withered away.

I’ve been moving around in the boot for 3 weeks and day by day been doing more walking practice. Not been going crazy, I’ve been keeping it low key, but making a careful effort to move around more like usual.

Week 1 in the brace boot (for me this was week 7 of the Achilles recovery programme) was double crutches, and just putting a little bit of body weight on the bad leg, a quarter or less at a guess. Enough to get the leg engaged in having to shift some load but not too much to ring alarm bells.

Week 2 in the brace boot (week 8): things speeding up; I have dumped the bad-side crutch, not entirely, but more often than not. I’m automatically walking around the kitchen and even upstairs, keeping a careful grip on the handrail, because the boot prevents normal ankle movement, which is a killer for falling backwards when going up or down stairs! Leg strength is still really shaky, the quads and glutes are really weak, but I can make them work better now.

Week 3 in the brace boot (week 9 – 10) feeling ok to walk around for much of the day at home, and in the office where I do a mostly sit down job, without any crutches. I cannot do anything heavy with my upper body though. For those doing physical jobs requiring moving loads or anything needing strength, you just can’t do it without putting scary and probably very unwise pressures on the recovering Achilles area, even if you try to protect that area. I tried doing some small DIY jobs at home, sawing through a worktop made me realise that you need your whole body to be safe to do jobs like that and I wasn’t, so I haven’t done stuff that I wanted to do.

To conclude this post (I will post some more seeing as it is the biggest life event for years since having children) the recovering area on my bad leg feels very different from the good leg. When I first came out of the plaster cast, the bad foot and lower leg felt waxy, hard, like it was inflated with some solid wax, with bone structure just about feelable, but the skin was more like skin over marble than over my leg!

The hot bath that night was bliss though! This is the best bath one can ever have surely (apart from those recovering from worse things of course).

Gradually, over the last few weeks, my bad leg has started to feel more like my leg again, can even feel the veins beneath the skin on the top of my foot. However, the Achilles area is still buried in a kind of hard scar tissue, surrounding the recovering tendon. The tendon itself is feelable as a ridge along the hard tissue. I think it is just starting (at week 10 now) to feel a tiny bit more tendon like, but it is still a huge bar of tissue and nowhere near the normal tendon dimensions.

Next post I’ll talk a bit about what I’m able to do with my bad leg and what it feels like, and also start on the whole business of my experience using NHS services. The good and the bad: quite a lot of disappointingly disorganised service, amongst good things too.

White cane syndrome is real – crutches are better for your social life

Over the last 5 weeks I’ve dumped my white cane in favour of a pair of crutches, after busting my leg. A weird thing seems to be happening: the social encounters and conversations I’m having with random people have changed, for the better.

Even though I cannot make eye contact, for some reason, people have come up to me for a chat and for once, nobody has mentioned that I’m blind. I think a lot of people don’t even notice, maybe they think I’m just a bit shy. The crutches and the leg in plaster create the talking point and from there conversations develop in a fun way, something that I haven’t experienced since starting to use a white cane.

I think white cane syndrome is real. White canes seem to severely limit and disrupt the way random social encounters work. As I love the whole business of meeting new people and having a bit of playful chat, my experience of crutches is that while I’ve lost my ability to get around almost completely, I’ve gained an amazing insight into what my life would be like if I didn’t need to use a white cane.

So, when I finally give up the crutches which won’t be for a good few months, I’m not looking forward to using a white cane again. In fact I so want to dump any symbol of blindness as it messes up normal social interaction. Maybe it’s time for me to walk with a different kind of stick… A pair of crutches signals walking problems not seeing problems so they aren’t going to work, but if I can somehow invent a symbol that says “I can see but I can’t avoid you so please walk round me” that’d be exactly what’s needed.

Or, if I can get an app for my phone which tells me the same information as my white cane does, that’d also do the job. But, before anyone thinks that’s a remotely realistic thing to design, you’ll need to fully understand how powerful a sensory device the simple stick is. It’s not the stick that’s powerful, it’s the way it physically extends the touch and haptic capabilities of the hand and fingers, and how that links into the brain; it’s the brain’s systems that “read” all this dynamic and instantaneous information. The brain is extremely powerful and, often, not well understood.

Most people, seeing the road ahead visually, are converting all that information from 2D into a 3D model and then adjusting the way they move based on it. But when you don’t have that 2D visual scene to use, it’s not just a matter of getting a device to describe the scene to a blind person. Anyone who tries to do that when guiding a blind person will know it’s mostly impossible to do quickly enough and accurately enough, so assuming a processor can do it better than a human is the wrong track.

Also, a crucial part of moving through a space busy with people is their reaction and response to the white cane itself. The takes-two-to-tango factor is a massive thing to take into account, and means I am unlikely to be able to dump my white cane and just use a non-white stick. Perhaps this is why I and many others make an inner choice about using a white cane, trading off whether we want to be able to move around on our own or want to feel “social neutralness” by not showing a white cane.

Addressing these issues should be a priority for anyone working to promote the rights and free movement of blind people in our society.

Android apps with unlabelled buttons cause blind users real trouble

If you can see these images (I can’t), they show an example of the British Radio app on Android as sighted people see it, and as I hear it using TalkBack. Visually, the icons show the radio stations on offer. For people like me using TalkBack, you just get random numbers and unlabelled buttons.

This is a major Android app issue, and this app is one in a long line of apps I’ve had to uninstall because it is useless to me without labelled buttons.

So big thanks to iheni for posting today on the technical solution, Alternatives on Android. I hope developers will realise that labelling buttons is a really good idea.
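For any developer wondering what the fix actually involves: in an Android layout, an image-only button carries no text for TalkBack to speak unless it is given an `android:contentDescription`. A hypothetical layout fragment (the ids, drawable and station name here are made-up illustrations, not taken from the real app):

```xml
<!-- Hypothetical image-only station button. Without android:contentDescription,
     TalkBack announces it as an unlabelled button; with it, TalkBack speaks
     the station name. -->
<ImageButton
    android:id="@+id/station_button"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/station_logo"
    android:contentDescription="BBC Radio 4" />
```

The same label can be set from code with `setContentDescription()`, and purely decorative images can set `android:contentDescription="@null"` so screen readers skip them rather than announcing noise.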