IBM & CMU assisting in mobility

Mobility for the visually impaired is always difficult, from simple tasks such as heading to Starbucks for a coffee to jumping on a bus or grabbing a taxi. Let's take the first example: heading to Starbucks is certainly challenging when you are unable to see, but what about when you enter the store? Without sighted assistance, locating the counter, or indeed finding somewhere to sit, is difficult.

Therefore, any technology that aims to ease these mobility issues is a step in the right direction. At the risk of this blog turning into IBM fandom, here is yet another project IBM are working on.

Along with Carnegie Mellon University, IBM have developed and open-sourced a smartphone application that can help you move from point A to point B.

The app, called NavCog, uses either voice or haptic feedback to guide the user, and currently relies on beacons placed around the environment to work out where you are.
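NavCog itself is open source, but as a rough illustration of how beacon-assisted guidance works in general (this is purely a sketch and not NavCog's actual code; the beacon IDs, cue strings and signal threshold below are invented), the core loop amounts to finding the strongest nearby beacon and announcing the cue mapped to it:

```typescript
// Hypothetical sketch of beacon-assisted navigation, NOT NavCog's real code.
// Assumes the phone periodically reports signal strength (RSSI) readings
// for nearby beacons, and that each beacon has been mapped to a cue.

interface BeaconReading {
  id: string;   // beacon identifier, e.g. UUID plus major/minor values
  rssi: number; // received signal strength in dBm; closer to 0 = nearer
}

// Illustrative mapping from beacon to the message spoken (or buzzed) on approach.
const cues: Record<string, string> = {
  "entrance-01": "Entrance ahead. The counter is five metres to your left.",
  "counter-02": "You have reached the counter.",
  "seating-03": "Free seating area to your right.",
};

// Pick the strongest (roughly the nearest) beacon and return its cue,
// ignoring weak readings that are likely bleeding in from adjacent areas.
function nextCue(readings: BeaconReading[], threshold = -75): string | null {
  const nearest = readings
    .filter((r) => r.rssi > threshold)
    .sort((a, b) => b.rssi - a.rssi)[0];
  return nearest ? cues[nearest.id] ?? null : null;
}

// Example scan where the counter beacon is strongest:
console.log(
  nextCue([
    { id: "entrance-01", rssi: -82 },
    { id: "counter-02", rssi: -60 },
  ])
); // "You have reached the counter."
```

In a real system the cue would of course be handed to a speech synthesiser or a haptic engine rather than printed, but the beacon-to-instruction mapping is the heart of the idea.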

It is great to see beacons and haptic feedback combined to aid navigation. Over four years ago I was pitching to just about every GPS manufacturer that this could be an interesting direction to head in. My ideas seemed sound when Apple announced the Apple Watch, which used the exact same haptic feedback system I had been proposing. Furthermore, using beacon technology to navigate is exactly what I pitched to British Airways a couple of years ago.

I proposed that beacons throughout Terminal 5 could not only direct potential customers to shops, restaurants and gates, but also help visually impaired customers navigate the terminal.

It is truly great to see all these ideas put together and finally implemented. We now just need a larger rollout of beacon technology!

This system could also be adapted to solve the indoor navigation problem. I was speaking with Google a year or so ago about how Project Tango could be utilised to achieve this. I imagined a haptic feedback device that could assist in real-time indoor navigation. After all, my guide dog may be able to avoid obstacles, but an empty chair is an obstacle to my guide dog!

Accessibility of Facebook on the desktop

Maximising the accessibility of a website is always of great importance. As well as developers improving accessibility on their side, there is a plethora of tools available that increase accessibility client-side. I always encourage the use of client-side tools, as they make for a richer, more seamless experience.

In the quick look video below I demonstrate the ability of Hacker Vision for Google Chrome. Hacker Vision is an extension capable of intelligently inverting websites; by this I mean it inverts only text and background elements, rather than images. This is a vast improvement over, say, the invert feature built into Apple hardware: because the built-in invert operates at the hardware level, the entire screen is inverted, so elements such as images end up inverted too. Hacker Vision avoids this, making for a far more pleasant web experience for the low-vision user.
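I have not dug into Hacker Vision's source, but the usual way to achieve this kind of "smart invert" in a browser extension is to invert the whole page with a CSS filter and then apply the same filter a second time to images and video, so the two inversions cancel out. A minimal sketch, assuming it runs as a content script:

```typescript
// A minimal sketch of the "smart invert" technique (not Hacker Vision's
// actual source): invert the entire page with a CSS filter, then apply the
// same filter again to media elements so they cancel back to normal.
function applySmartInvert(doc: Document): void {
  const style = doc.createElement("style");
  style.textContent = `
    html {
      filter: invert(1) hue-rotate(180deg); /* dark background, light text */
    }
    img, video, picture, canvas {
      filter: invert(1) hue-rotate(180deg); /* double-invert: media look normal */
    }
  `;
  doc.head.appendChild(style);
}

// In a Chrome extension this would run as a content script on page load.
applySmartInvert(document);
```

The hue-rotate(180deg) step keeps colours roughly recognisable after inversion, rather than swapping every hue for its complement.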

The video also demonstrates the different forms of zoom available to the user, and quickly compares window zoom with font scaling. I believe font scaling is incredibly powerful on the client side, and it is something I will touch on in a subsequent ThoughtCast.
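To make the distinction concrete: window zoom magnifies everything, layout and images included, while font scaling enlarges only the text and lets it reflow within the existing layout. A minimal sketch of client-side font scaling, assuming the site sizes its text in relative units such as em or rem:

```typescript
// Illustrative sketch only: enlarge text without magnifying the whole page.
// Bumping the root font size makes any em/rem-sized text reflow larger,
// while images and the overall layout stay at their original size.
function scaleFonts(factor: number): void {
  document.documentElement.style.fontSize = `${100 * factor}%`;
}

scaleFonts(1.5); // text 50% larger, layout and images untouched
// Window zoom, by contrast, is the browser's Ctrl/Cmd-+ behaviour: it
// magnifies the whole page, often forcing horizontal scrolling.
```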

I chose to demonstrate these features with Facebook, mainly because Facebook is often cited as having poor accessibility. I do not believe this to be true; a fairer assessment would be that Facebook is doing a reasonable job and plays relatively well with client-side tools. However, it must be noted that these client-side solutions will work on any website, and in the case of Hacker Vision can even be tailored to invert only the websites you wish it to. Therefore, a website that already has a dark theme need not be inverted.