This is an update to a previous post, where I grandly proclaimed that medical science is more likely to fix digital accessibility than implementation of the Web Content Accessibility Guidelines.
I still believe this is true, but there is a new dimension, explained below. My original idea was based on the problem of biology being less complex to solve than the socio-economic problems from which most digital accessibility barriers arise in the first place, plus the fact that pharma companies can make money out of medical treatments that repair lost eyesight. All of which means the bio-solution has a clearer business case for universally solving digital accessibility barriers than ‘what you should think and do’ guidelines that attempt to change the way people behave in their day-to-day communications practices.
For now the status quo has remained, because markets arise out of other markets that aren’t working properly. So, as one example, the Jaws screen reader is a third-party product that rose up out of Microsoft’s inability to build an accessible Windows platform and application ecosystem, so a market for a Jaws-type product came into play. Jaws is a well-developed product, but let’s face it, if you had the choice of paying £10,000 to fix your eyes or £1,000 to buy Jaws, you would pay to fix your eyes, even if you weren’t allowed to use them for anything other than interacting with your computer. I say this because seeing the screen and being able to use the mouse pointer normally is more than ten times better than having to manage with Jaws, which does not provide an equal level of access to the Windows environment. For now, there is no ‘fix your eyes’ product for most blind people out there, yet.
So while disabled people remain disabled by digital barriers, and the problems persist unsolved, there remains a strong business case for the whole accessibility industry: from advocates, consultants, testers and validators to trainers, product designers, developers and so on.
The reason I am posting today is that I need to add another dimension to the above idea, and it changes the calculation I’m trying to make by changing the paradigm. Access technologies like screen readers are just one type of access technology, bound up with and ultimately restricted by the very platforms they are trying to make accessible. There are other kinds of access technology, already developed or in development, that circumvent or remove the need to go onto websites and use screen readers in the first place.
You might remember a while ago I suggested that my smartphone is almost becoming a talking magnifying glass in my hand: wave it over visual information, or in fact just wave it in the air, and it converts what I’m trying to see (whether that is printed text or my physical surroundings) into auditory information that I can interact with. I say almost becoming, because the problem is that these amazing developments are really glitchy, horribly fiddly to use in the real world, and really don’t match the instant information grab that a seeing person gets with their eyes. Like a sighted person, I do not have enough time in the day to properly manage all the pressures of work, family, health and fitness, finances, social life and so on, and I definitely don’t have spare time to take ten times longer to read a note or check my surroundings because the technology requires me to fight with a touch-screen interface and strain to hear and interpret the voice feedback.
Anyway, one point I am trying to make here is that a new form of “seeing” is possibly coming about: instead of using one’s eye and retina, the same effect as seeing is achieved using equivalent information of the sort easily accessed by a smartphone, specifically geolocation and camera image processing matched against known things.
How that is then presented to you, in a way you can control, is the key thing. Smartphones and operating apps are often not the most appropriate way, because that interactive mode does not match the state one is in for that activity. Hence Google Glass does fit the bill for the above types of activity. Yes, it hasn’t found its place yet, but I am more and more touching the ordinary glasses I wear in a kind of hopeful way that prepares me to mutter to them: “is there a bus stop near me and which way should I turn”, “give me a bleep when the barman is looking in my direction” and “I need to get across this busy shared space, please give me walking directions to the best pedestrian crossing and bleep when I’m looking directly at it, using the image feed combined with the GPS data”.
At present, none of the apps I have on my phone can directly supply these common daily needs, but a Google Glass type device just might.
So, a new question opens up, will a Google Glass type product come about sooner than the medics can invent ways to repair wrecked retinas…
Will 2016 prove a significant year for answering this question I wonder?
So Part 1 of my story starts at week 10, after rupturing my Achilles tendon back in August. My life has been so turned upside down, that I have only just realised I should blog this.
I bust my Achilles tendon on a running track in August, overloading it during a series of standing-start hurdles and sprint starts. I won’t go into in this post what it felt like when it blew, but it’s still very fresh in my mind.
Diagnosed in A&E later that night (yep, five hours of waiting), which is hard enough on your own, but my partner and two young children were in tow and we were miles from where we were staying.
A plaster cast was applied to my entire lower leg that night, with the foot in a strong downward point and toes poking out. Next day I saw the orthopaedic surgeon, who advised a conservative treatment programme. This, I think, should also be called the low-cost option for the NHS compared to surgery. Back then I was cross and felt cost considerations were perhaps outweighing my long-term prognosis, whatever the surgeon said the evidence suggests (similar success rates for surgery versus conservative treatment), but now, as you’ll read below, I don’t feel like this.
So skipping forward over the last two months of, well, is there a word for it? I’m now at 10 weeks and walking again reasonably ok, but still in a brace boot.
The boot comes after the six weeks or so in plaster cast. It keeps your ankle at roughly a 90-degree angle. When I came out of the plaster cast at 6.5 weeks, the plaster technician felt the active range of motion (AROM) of my ankle was good enough for me to progress into the brace boot with a 2cm block rather than the usual 4cm block.
The first week of attempting to walk in the boot was very weird. The whole leg from the hip down needed to re-learn how to walk. I could shuffle, so it wasn’t like starting from nothing, but it was a shock just how much capability had withered away.
I’ve been moving around in the boot for three weeks and, day by day, doing more walking practice. I haven’t been going crazy, I’ve been keeping it low key, but making a careful effort to move around more like usual.
Week 1 in the brace boot (for me this was week 7 of the Achilles recovery programme) was double crutches, and just putting a little bit of body weight on the bad leg, a quarter or less at a guess. Enough to get the leg engaged in having to shift some load but not too much to ring alarm bells.
Week 2 in the brace boot (week 8): things are speeding up, and I have dumped the bad-side crutch, not entirely, but more often than not. I’m automatically walking around the kitchen and even upstairs, keeping a careful hold on the handrail, because the boot prevents normal ankle movement, which is a killer for falling backwards when going up or down stairs! Leg strength is still really shaky, the quads and glutes are really weak, but I can make them work better now.
Week 3 in the brace boot (weeks 9–10): feeling ok to walk around for much of the day, at home and in the office where I do a mostly sit-down job, without any crutches. I cannot do anything heavy with my upper body though. For those doing physical jobs requiring moving loads, or anything needing strength, you just can’t do it without putting scary and probably very unwise pressure on the recovering Achilles area, even if you try to protect it. I tried doing some small DIY jobs at home; sawing through a worktop made me realise that you need your whole body to be safe doing jobs like that, and I wasn’t, so I haven’t done stuff that I wanted to do.
To conclude this post (I will post more, seeing as this is the biggest life event in years, since having children), the recovering area on my bad leg feels very different from the good leg. When I first came out of the plaster cast, the bad foot and lower leg felt waxy and hard, like it was inflated with some solid wax, with the bone structure just about feelable, but the skin was more like skin over marble than skin over my leg!
The hot bath that night was bliss though! This is the best bath one can ever have surely (apart from those recovering from worse things of course).
Gradually, over the last few weeks, my bad leg has started to feel more like my leg again; I can even feel the veins beneath the skin on the top of my foot. However, the Achilles area is still buried in a kind of hard scar tissue surrounding the recovering tendon. The tendon itself is feelable as a ridge along the hard tissue. I think it is just starting (at week 10 now) to feel a tiny bit more tendon-like, but it is still a huge bar of tissue and nowhere near normal tendon dimensions.
Next post I’ll talk a bit about what I’m able to do with my bad leg, what it feels like, and also start on the whole business of my experience using NHS services. The good and the bad: quite a lot of disappointingly disorganised service, amongst good things too.
If you can see these images (not me), here is an example of the British Radio app on Android as sighted people see it, and how I hear it using Talkback. Visually, the icons show the radio stations on offer. For people like me using Talkback, you just get random numbers and buttons.
This is a major Android app issue, and this app is one in a long line of apps I’ve had to uninstall because it is useless to me without labelled buttons.
So big thanks to iheni for posting today on the technical solution, Alternatives on Android. I hope developers will realise that labelling buttons is a really good idea.
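For any developers reading: the fix really is small. Android lets you attach a spoken label to any image-only control via the standard `android:contentDescription` attribute, which Talkback reads aloud instead of announcing a bare “Button”. A minimal sketch of what a properly labelled station button might look like in a layout file (the ids, drawable and station name here are just illustrative, not from the actual app):

```xml
<!-- Without contentDescription, Talkback announces this as an unnamed button.
     With it, the user hears "Play BBC Radio 4, button". -->
<ImageButton
    android:id="@+id/play_station"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/station_logo"
    android:contentDescription="Play BBC Radio 4" />
```

The same label can be set from code with `setContentDescription()` when buttons are created dynamically. One attribute per button, and the app stops being a grid of random numbers.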
I’ve posted on this idea before, but something has pinged me into posting on it again: my best-guess view is that eyes are going to get fixed before inaccessible design, for three main reasons:
1. Average: in terms of eyesight, a company can design its digital services or products to fit an “average” sort of user, but by definition any sort of eyesight impairment means you aren’t average and will have one or more requirements that cannot be built in. Universal design is the accessibility industry’s pipe dream, because no single design can flex to everyone’s variable needs, but they can make money out of the possibility! The medical industries, however, are coming at it the other way: they are developing treatments that attempt to return those non-average functioning parts of our bodies back to average.
2. Profits: overall I’m more convinced that profits can be made by the pharmaceutical companies in creating sellable treatments, funded by taxpayers, that preserve or repair eyesight, than that digital companies can gain sufficient profit from new disabled customers to offset the cost of inclusive design.
3. Hearts and minds: “average is normal, it’s easier and more efficient”, and this applies to all sorts of dynamics, so this paragraph is a bit long! The only way any of us really find joy in our lives is when we understand and have a sense of control over the world around us, whether that is physical, social or emotional. Self-awareness of an impairment to any part of our bodies or minds is extremely difficult to learn to enjoy or feel is right, because society has a powerful negative bias and defines it as wrong. Yes, some people live and thrive and stand out from the average, often those inspirational people and always the ones who are public advocates for inclusive design, but the average disabled person doesn’t. So I think the average disabled person, underneath any external image of pride and confidence in their own body, knows that anything that helps return an average level of performance to the part of their body that is not working averagely is preferable to waiting for society (that is, every app and website designer) to build in fully inclusive functionality. Individually speaking, they won’t use equality laws, they won’t complain, but they will remain reliant on others to make up the deficit; that’s the daily grind. That is why even the hint of a medical fix for an impairment gets disabled people, their family, friends and all those journalists so excited, for slightly different reasons. On the supply side, a partly accessible website hasn’t got a hope of creating excitement in anyone. There is no positive incentive inside a company to invest enough to deliver real inclusion; the drivers are mainly about limiting the risk of legal problems, but hardly anyone uses the law anyway, so this risk is not strong and is probably weakening. The business model has never been proved to make any sense either.
It is often stated that disabled people have a combined spending power of £80 billion, ready money for any company to get its hands on by making a few accessibility adjustments, but this figure has been promoted for years and years, and it seems to me the cost of designing everything to work with every kind of disability and impairment would be orders of magnitude higher. Having said this, has anyone actually worked out what making society accessible would cost and what the economic benefit would be?
Just a quick opinion piece about using Google Docs and Google Drive on my Android, eyes-free of course. It’s so close to being literally amazing, but the bugs and glitches let it down much more than if it were the usual mediocre attempt at accessibility that is the norm. So close, but not close enough to land the prize of being the best document reading, creating and sharing solution since sliced bread… (erm, notice a slight idiom breakdown there…)
I’ll post some specifics on this soon…
Without doubt, the new accessible information standard 1605, announced last Friday by NHS England, is THE biggest public sector thing to happen to accessible information in the last decade, and I think it will have a much bigger impact than the accessible information provisions in the Equality Act 2010. Sad to say, I don’t think the Equality Act 2010 really changed any games; maybe it set some new rules, but few followed them and even fewer have used the law to secure their rights. 1605 secures the right for you, and inspections will be carried out by the Care Quality Commission, saving blind people from having to take on cases themselves.
For the record, the two other big game changers are what Apple and Google have done by embedding accessibility into their smartphones, tablets and computers for free. Anyone who can use these devices has a drastically reduced need for transcription services or special alternative-format provision, because the device can speak, or make a large print version on the fly, of the mainstream message, email, document or web page that the company or service produces as standard for everyone else. It’s a win-win business strategy because it brings the blind person into the mainstream.
The new NHS standard effectively brings blind people into the mainstream too, and should be a win-win as well.