
have you spotted any strange clouds yet today? Facebook’s servers can now decode and describe images

BBC article on Facebook’s image reader technology
Even if the video demo shows only the most successful examples of automated image descriptions, it is still an amazing thing to hear machines that have been programmed to describe pictures in words actually doing it!

Inevitably there will be many images where the machines produce one description while a real human would find some completely different significance and describe it completely differently, but this is the start of a new era.

Screen readers that read text out loud are now what we could call first-generation assistive technologies: straight-talking text converters. But these image readers that Facebook is using now are second generation, and a whole different ball game.

[image showing a person smiling and staring quizzically into a golden sunlit horizon with strange shaped clouds scudding across the sky]

Arr, if you are sighted you’ll wonder where the image is that the above text describes. There isn’t one; it’s in my head! You’ll just have to imagine it, just like I’ve imagined all the images that my screen reader has announced simply as “graphic” thus far!!

Oh, and yes the image is conveying what I think of this new technology…

will new technology change the definition of “a cure for blindness” in 2016?

This is an update to a previous post, where I grandly proclaimed that medical science is more likely to fix digital accessibility than implementation of the Web Content Accessibility Guidelines.

I still believe this is true, but there is a new dimension, explained below. My original idea was based on biology being less complex to solve than the socio-economic problems from which most digital accessibility barriers are created in the first place, plus the fact that pharma companies can make money out of medical treatments that repair lost eyesight. All of which means the bio-solution has a clearer business case for universally solving digital accessibility barriers than ‘what you should think and do’ guidelines that attempt to change the way people behave in their day-to-day communications practices.

For now the status quo has remained, because markets arise out of other markets that aren’t working properly. As one example, the Jaws screen reader is a third-party product that rose up out of Microsoft’s inability to build an accessible Windows platform and applications eco-system, so a market for a Jaws-type product came into play. Jaws is a well-developed product, but let’s face it: if you had the choice of paying £10,000 to fix your eyes or £1,000 to buy Jaws, you would pay to fix your eyes, even if you weren’t allowed to use them for anything other than interacting with your computer. I say this because seeing the screen and being able to use the mouse pointer normally is more than 10 times better than having to manage with Jaws, which does not give equal levels of access to the Windows environment. For now, there is no ‘fix your eyes’ product for most blind people out there, yet.

So while disabled people remain disabled by digital barriers, and the problems persist unsolved, there does remain a strong business case for the whole accessibility industry: from advocates, consultants, testers and validators to trainers, product designers and developers etc.

The reason I am posting today is that I need to add another dimension to the above idea, and it does change the calculations I’m trying to make above by changing the paradigm. Access technologies like screen readers are just one type of access technology, bound up with and ultimately restricted by the very platforms they are trying to make accessible. There are other kinds of access technologies, already developed or in development, that circumvent or remove the need to go onto websites and use screen readers in the first place.

You might remember a while ago I suggested that my smartphone is almost becoming a talking magnifying glass in my hand: wave it over visual information, or in fact just wave it in the air, and it converts what I’m trying to see (whether printed text or my physical surroundings) into auditory information that I can interact with. I say almost becoming, because these amazing developments are really glitchy, horribly fiddly to use in the real world, and really don’t match the instant information grab that a seeing person can get with their eyes. Like any sighted person, I do not have enough time in the day to properly manage all the pressures of work, family, health and fitness, finances, social life etc, and I definitely don’t have time spare to take 10 times longer to read a note or check my surroundings because the technology requires me to fight with a touch-screen interface and strain to hear and interpret the voice feedback.

Anyway, one point I am trying to make here is that a new form of “seeing” is possibly coming about: instead of using one’s eye and retina, the same effect as seeing comes about using equivalent information of the sort easily accessed by a smartphone, specifically geolocation and camera image processing matched against known things.

How that is then presented to you, in a way you can control, is the key thing. Smartphones and operating apps are often not the most appropriate way, because that interactive mode does not match the state one is in for the activity. Hence Google Glass does fit the bill for the above types of activities. Yes, it hasn’t found its place yet, but I am more and more touching the ordinary glasses I wear in a kind of hopeful way that prepares me to mutter to them “is there a bus stop near me and which way should I turn”, “give me a bleep when the barman is looking in my direction” and “I need to get across this busy shared space, please give me walking directions to the best pedestrian crossing and bleep when I’m looking directly at it, using the image feed combined with the GPS data”.

At present, none of the apps I have on my phone can directly supply these common daily needs, but a Google Glass type device just might.

So, a new question opens up: will a Google Glass type product come about sooner than the medics can invent ways to repair wrecked retinas…

Will 2016 prove a significant year for answering this question I wonder?

recovering from a ruptured Achilles tendon – my treatment and my story part 1 + amateur athlete + blind + got young kids

So Part 1 of my story starts at week 10, after rupturing my Achilles tendon back in August. My life has been so turned upside down that I have only just realised I should blog this.

I bust my Achilles tendon on a running track in August: overloaded it doing a series of standing-start hurdles and sprint starts. I won’t go into in this post what it felt like when it blew, but it’s still very fresh in my mind.

Diagnosed in A&E later that night (yep, 5 hours of waiting), which is hard enough on your own, but my partner and two young children were in tow and we were miles from the place we were staying.

A plaster cast was applied to the entire lower leg that night, with the foot in a strong downward point and toes poking out. Next day I saw the orthopaedic surgeon, who advised a conservative treatment programme. This, I think, should also be called the low-cost option for the NHS compared to surgery. Back then I was cross and felt perhaps cost considerations were outweighing my long-term prognosis, whatever the surgeon said the evidence suggests (similar success rates for surgery versus conservative treatment), but now, as you’ll read below, I don’t feel like this.

So skipping forward over the last two months of, well, is there a word for it? I’m now at 10 weeks and walking again reasonably ok, but still in a brace boot.

The boot comes after the 6 weeks or so in a plaster cast. The boot keeps your ankle at a roughly 90-degree position. When I came out of the plaster cast at 6.5 weeks, the plaster technician felt the active range of motion (AROM) of my ankle was good enough for me to progress into the brace boot with a 2cm block rather than the usual 4cm block.

The 1st week of attempting to walk in the boot was very weird. The whole leg, from the hip down, needed to re-learn how to walk. I could shuffle, so it wasn’t like starting from nothing, but it was a shock just how much capability had withered away.

I’ve been moving around in the boot for 3 weeks now, and day by day I’ve been doing more walking practice. Not been going crazy, I’ve been keeping it low key, but making a careful effort to move around more like usual.

Week 1 in the brace boot (for me this was week 7 of the Achilles recovery programme) was double crutches, and just putting a little bit of body weight on the bad leg, a quarter or less at a guess. Enough to get the leg engaged in having to shift some load, but not so much as to ring alarm bells.

Week 2 in the brace boot (week 8): things speeding up, and I’ve dumped the bad-side crutch, not entirely, but more often than not. I’m automatically walking around the kitchen and even upstairs, keeping a careful hold on the handrail, because the boot prevents normal ankle movement, which is a killer for falling backwards when going up or down stairs! Leg strength is still really shaky, the quads and glutes are really weak, but I can make them work better now.

Week 3 in the brace boot (weeks 9–10): feeling ok to walk around for much of the day without any crutches, at home and in the office where I do a mostly sit-down job. I cannot do anything heavy with my upper body though. For those doing physical jobs requiring moving loads or anything needing strength, you just can’t do it without putting scary and probably very unwise pressures on the recovering Achilles area, even if you try to protect it. I tried doing some small DIY jobs at home; sawing through a worktop made me realise that you need your whole body to be safe doing jobs like that, and I wasn’t, so I haven’t done stuff that I wanted to do.

To conclude this post (I will post some more, seeing as this is the biggest life event in years, since having children), the recovering area on my bad leg feels very different from the good leg. When I first came out of the plaster cast, the bad foot and lower leg felt waxy and hard, like it was inflated with some solid wax, with the bone structure just about feelable, but the skin was more like skin over marble than over my leg!

The hot bath that night was bliss though! This is the best bath one can ever have surely (apart from those recovering from worse things of course).

Gradually, over the last few weeks, my bad leg has started to feel more like my leg again; I can even feel the veins beneath the skin on the top of my foot. However, the Achilles area is still buried in a kind of hard scar tissue surrounding the recovering tendon. The tendon itself is feelable as a ridge along the hard tissue. I think it is just starting (at week 10 now) to feel a tiny bit more tendon-like, but it is still a huge bar of tissue and nowhere near the normal tendon dimensions.

Next post I’ll talk a bit about what I’m able to do with my bad leg and what it feels like, and also start on the whole business of my experience using NHS services: the good and the bad, quite a lot of disappointingly disorganised service, amongst good things too.

White cane syndrome is real – crutches are better for your social life

Over the last 5 weeks I’ve dumped my white cane in favour of a pair of crutches, after busting my leg. A weird thing seems to be happening: the social encounters and conversations I’m having with random people have changed, for the better.

Even though I cannot make eye contact, for some reason people have come up to me for a chat and, for once, nobody has mentioned that I’m blind. I think a lot of people don’t even notice; maybe they think I’m just a bit shy. The crutches and the leg in plaster create the talking point, and from there conversations develop in a fun way, something I haven’t experienced since starting to use a white cane.

I think white cane syndrome is real. White canes seem to severely limit and disrupt the way random social encounters work. As I love the whole thing of meeting new people and having a bit of playful chat, my experience of crutches is that while I’ve lost my ability to get around almost completely, I’ve gained an amazing insight into what my life would be like if I didn’t need to use a white cane.

So, when I finally give up the crutches, which won’t be for a good few months, I’m not looking forward to using a white cane again. In fact I so want to dump any symbol of blindness, as it messes up normal social interaction. Maybe it’s time for me to walk with a different kind of stick… A pair of crutches signals walking problems, not seeing problems, so they aren’t going to work, but if I could somehow invent a symbol that says “I can see but I can’t avoid you, so please walk round me”, that’d be exactly what’s needed.

Or, if I can get an app for my phone which tells me the same information as my white cane does, that’d also do the job. But before anyone thinks that’s a remotely realistic thing to design, you’ll need to fully understand how powerful a sensory device the simple stick is. It’s not the stick itself that’s powerful, it’s the way it physically extends the touch and haptic capabilities of the hand and fingers, and that links into the brain, and it’s the brain’s systems that “read” all this dynamic and instantaneous information. The brain is extremely powerful and, often, not well understood.

Most people seeing the road ahead visually are converting all that information from 2D into a 3D model and then adjusting the way they move based on that. But when you don’t have that 2D visual scene to use, it’s not just a matter of getting a device to describe the scene to a blind person. Anyone who tries to do that when guiding a blind person will know it’s mostly impossible to do quickly and accurately enough, so assuming a processor can do it better than a human is the wrong track.

Also, a crucial part of moving through a space busy with people is their reaction and response to the white cane itself. The takes-two-to-tango factor is a massive thing to take into account, and it means I am unlikely to be able to dump my white cane and just use a non-white stick. Perhaps this is why I and many others make an inner choice about using a white cane, trading off whether we want to be able to move around on our own or want to feel “social neutralness” by not showing a white cane.

Addressing these issues should be a priority for anyone working to promote the rights and free movement of blind people in our society.

Android apps with unlabelled buttons cause blind users real trouble

If you can see these images (I can’t), they show the British Radio app on Android as sighted people see it, and how I hear it using Talkback. Visually, the icons show the radio stations on offer. For people like me using Talkback, you just get random numbers and buttons.

This is a major Android app issue, and this app is one in a long line of apps I’ve had to uninstall because they are useless to me without labelled buttons.

So big thanks to iheni for posting today on the technical solution, Alternatives on Android. I hope developers will realise that labelling buttons is a really good idea.

Pharma will solve inclusive digital because it costs less to fix people than to fix digital

I’ve posted on this idea before, but something has pinged me into posting on it again. My best guess is that eyes are going to get fixed before inaccessible design does, for three main reasons:

1. Average: in terms of eyesight, a company can design their digital services or products to fit an “average” sort of user, but by definition any sort of eyesight impairment means you aren’t average and will have one or more requirements which cannot be built in. Universal design is the accessibility industry’s pipe dream, because no single design can flex to everyone’s variable needs, but they can make money out of the possibility! The medical industries, however, are coming at it the other way: they are developing treatments that attempt to return those non-average functioning parts of our bodies back to average.

2. Profits: overall, I’m more convinced that the pharmaceutical companies can make profits by creating sellable treatments, funded by taxpayers, that preserve or repair eyesight, than that digital companies can gain sufficient profits from new disabled customers to offset the cost of inclusive design.

3. Hearts and minds: “average is normal, it’s easier and more efficient”, and this applies to all sorts of dynamics, so this section is a bit long! The only way any of us really find joy in our lives is when we understand and have a sense of control over the world around us, whether that is physical, social or emotional. Self-awareness of an impairment to any part of our bodies or minds is extremely difficult to learn to enjoy or feel is right, because society has a powerful negative bias and defines it as wrong. Yes, some people live and thrive and stand out from the average, often those inspirational people and always the ones who are public advocates for inclusive design, but the average disabled person doesn’t. So I think the average disabled person, underneath any external image of pride and confidence in their own body, knows that anything that helps return an average level of performance to the part of their body that is not working averagely is preferable to waiting for society (that is, every app and website designer) to build in fully inclusive functionality. Individually speaking they won’t use equality laws, they won’t complain, but they will remain reliant on others to make up the deficit; that’s the daily grind. That is why even the hint of a medical fix to an impairment gets disabled people, their family, friends and all those journalists so excited, each for slightly different reasons.

On the supply side, a partly accessible website hasn’t got a hope of creating any excitement in anyone. There is no positive incentive inside a company to invest enough to deliver real inclusion; the drivers are mainly about limiting the risk of legal problems, but hardly anyone uses the law anyway, so this risk is not strong and probably weakening. The business model has never been proved to make sense either. It is often stated that disabled people have a combined spending power of £80 billion, ready money for any company to get its hands on by making a few accessibility adjustments, but this figure has been promoted for years and years, and it seems to me the cost of designing everything to work with every kind of disability and impairment would be orders of magnitude higher. Having said this, has anyone actually costed what making society accessible would cost, and what the economic benefit would be?

Opinion – Google Docs and Google Drive – almost the best thing since sliced bread and definitely not half baked

Just a quick opinion about using Google Docs and Google Drive on my Android, eyes-free of course. It’s so close to being genuinely amazing, but the bugs and glitches let it down much more than they would in the usual mediocre attempt at accessibility that is the norm. So close, but not close enough to land the prize of being the best document reading, creating and sharing solution since sliced bread… (erm, notice a slight idiom breakdown there…)

I’ll post some specifics on this soon.
