Last year -> this year

Entering a new year always makes me ponder the challenges and goals for the year ahead. The past year was the usual treading of the lines between technology, inclusivity and running, so the year ahead will cover those bases, but in new ways.

Last year my two technology highlights were creating an eye gaze control system and working on user-led accessible hackathons. The eye gaze system saw its first use in a real-time painting robot; its applications, however, are much broader, and it would be great to see it integrated with environmental control in 2019.

The hackathons were also a fantastic success. There was a careful and thoughtful focus on the projects being user led. This allowed a number of disabled people to engage, highlight a goal they had and be a key driver during the hackathon. This is something we hope to grow not only this year but in future years.

This year I intend to work on the sonification of data to enable greater inclusivity within computer science, and particularly machine learning. Interpreting and analysing data is an important step and the current tools are somewhat lacking. This will form my MSc dissertation project and I am looking forward to getting stuck in.

Through exposure to some incredibly interesting projects during the hackathon work, I also intend to do a few side projects around switch access, with a focus on zero-force switch access, i.e. triggering switches without a physical press of a button.

The inevitable overeating at Christmas has also ensured I commit to some running. My favourite side of running nowadays is helping others achieve their goals. So in the first half of this year I will be training with a few friends and crossing the finish line alongside them at their first races.

There is of course always the thought of pushing the boundaries, something that is never too far away. All I need is for LiDAR to drop in price and that line of possibility will be moved forward once more.

AirPods, The Most Accessible Headphones

Headphones are an often overlooked but essential piece of equipment for the blind. Accessing a screen reader in the privacy of your own home, in a quiet room, is a simple affair: you can just use the loudspeaker of your phone or computer. Add some environmental noise, head outside or dare to venture into a coffee shop, and the loudspeaker is no longer a workable option.

Headphones enable me to use my iPhone both indoors and out and about; I literally couldn’t use my iPhone without headphones. Therefore, over the years I have amassed a rather substantial collection, everything from a cheap pair of JVCs up to a rather expensive pair of active noise-cancelling Bose. I am rarely seen without a pair of headphones and have them stuffed in every pocket and every bag.

I am constantly looking for the perfect pair of headphones, the pair that will make using my iPhone that much more accessible. Now I have found that elusive pair: the Apple AirPods.
The AirPods are Apple’s truly wireless earbuds: two single earpieces that fit snugly inside their own charging case.

They solve many of the problems a blind user has with headphones. Cables. Cables are a nightmare. Get them tangled in your pocket? Try untangling them when you can’t see. It just takes that much longer, to the point where if I quickly need to access my phone I would prefer not to; the time taken to untangle the headphones ends up being greater than the time I needed to use the phone. So often I would either ignore a notification and vow to take a look when I got home, or place the phone close to my ear to listen. After all, with a screen reader the only way you get privacy is by using headphones. Imagine if all your texts were read aloud. That embarrassing one from your friend is even more embarrassing when everyone in the lift hears it too!

So the wireless nature of the AirPods truly makes them more accessible. I can just quickly and easily slip them in. No cables to untangle; just flip the lid of the storage case and they are in my ears for that quick check of my phone.

This brings me to another of my favourite accessible features: using only one of the AirPods. When you rely on sound to understand what is happening around you, having one ear focus on the screen reader frees up the other for environmental noise. Handy when walking down the street, and handy at home or in a meeting. Previously, if I received a notification in a meeting and hadn’t worn headphones upon entering, I was left with three options: ignore the message, go through the messy untangling process or interrupt the flow of conversation by having everyone hear my notifications through the loudspeaker. Now I have a fourth option: just slip in one AirPod and I am away.

While out and about, another side effect of being blind is generally having only one hand free. To navigate around I either use my guide dog or a long cane. This gives me no way to untangle headphones, so I would often go for the loudspeaker approach, gambling with the possibility of dropping my phone as I attempted to juggle it around with one hand.

Now I just slip one AirPod out of the case, pop it in my ear and activate Siri.

There is one other fantastic bonus of using one earpiece: I double the battery life. Not to mention that whenever I remove them from the case, they are fully charged.

The AirPods truly have increased the accessibility of my iPhone by enabling me to use it in more everyday situations. I no longer have to remove myself from a social space to use my phone; these AirPods are increasing my social ability.

They truly are the most accessible headphones.

IBM Serendipity

Two years ago, in the middle of my degree, I went to meet with IBM HR. The idea was to have a chat with them about my vision of an inclusive and accessible world through technology.

IBM stand at a fantastic point within the technology sector, where they have the ability to touch a huge number of organizations in wildly different fields. It was this very point that made me think IBM and I could be a perfect match.

There is a need for all technology to be inclusively designed, to enable everyone to have universal access: from mobile devices, to the Internet of Things, to access to transport. Indeed, it was IBM’s Smarter Planet initiative that made me believe there was a way to make the world accessible through the advancement of new technologies.

I pitched to HR that I would be a wonderful fit for the role of accessibility evangelist, working with all manner of partners and focussing on how technology could be made inclusive: from advising on human interface interactions that have not only visual elements but also auditory and haptic ones, to communicating complex information in new and interesting ways. I continued by highlighting that the opportunity to interact with clients at an early stage would aid a universal design approach across all technology.

Indeed, it is this early-stage approach that has given me great success with Kickstarter. I often find projects in their very early stages and communicate with the team on how minor adjustments could improve accessibility, be it the addition of audible tones or changing a UI to take a blind user into account. I have also had great success with Fitbit and Drop scales. With both companies I advised on how to communicate information in different forms to increase accessibility. The added benefit of this change was a greater understanding by all users, not just those who cannot see.

I imagine a world where, as the next 1 billion people and 10 billion devices come online, there is no barrier to interaction, because these products and services have taken a universal approach from the beginning. It is also worth highlighting that this approach can benefit all users, not just those who rely on accessibility. For example, a low vision user may be aided by high-contrast or night mode colour themes; these exact features also assist any individual using the device at night. The route to a truly intuitive and simple design can also be found by taking on the needs of a blind user: if you can make a user interface or product that a blind user can utilise, it truly is simple and intuitive.

It was during this conversation that I highlighted how important this approach is to all services and products. There should never be an assumption that a particular product or service will not be utilised by a particular demographic. To illustrate this I mentioned how I had utilised RunKeeper to learn to run solo outdoors. It would have been easy for RunKeeper to assume a blind person would not use their app; after all, what use would it be to a blind person? But thankfully they made no such assumption, and I was able to achieve what was once perceived impossible: learning to run solo outdoors.

I continued by saying this is why I wanted to work with IBM: I wanted to make sure every service and every product across all sectors became accessible. Just imagine the impact this could achieve with the number of partners and clients IBM work with. With accessibility an assumed standard across the board, just imagine the impossible things that could be achieved in the next few years.

During the rest of the conversation, IBM HR mentioned they could imagine me starring in an IBM commercial, demonstrating what accessible technology can enable people to do. Well, fast forward two years and that opportunity arrived. IBM gave me a call and asked if I would like to be featured in a little video. I of course said yes, and the result is the video below.

In those past two years I have continued to try and make the world a more accessible place: advocating for universal design, working with many tech firms and making countless public speaking appearances at large tech events. But I still feel I could do so much more. There is still a need for that evangelist role, and I am still a great fit. There is a real need to ensure universal design across the board; when that goal is achieved, countless people will be enabled to achieve the impossible.

Looking forward to Apple TV OS

My favourite piece of technology in the living room is my Apple TV, and it is about to see a significant update. I love the Apple TV for two reasons: VoiceOver and Netflix. VoiceOver is fantastic at assisting in navigating the UI, as it reads aloud all the elements, and Netflix has a wealth of audio-described content.

However, it is limiting. I only access my media through Netflix, but I have a world of other media: numerous DVD and Blu-ray discs, all with great audio-described content. The problem is how I access this media. For example, identifying the discs and navigating the menus are both challenging and require sighted assistance. There just isn’t a great accessible removable-media device.

So the current solution is to rip these discs along with their audio-described content and AirPlay them to my Apple TV. This allows me to use a screen reader to select the content I would like to listen to. But it shouldn’t be this hard, and I hope the new Apple TV can help in this respect.

For the first time the Apple TV has an SDK, meaning developers can create apps for the system. This brings with it the opportunity to access my other media through an accessible UI. This isn’t just hypothetical either, as Plex have already announced their intention to release on the platform.

There is, however, one caveat: opening up the Apple TV to apps built with an SDK, instead of apps created under the strict UI guidelines of the past, gives developers free rein. With free rein comes the possibility of apps no longer supporting VoiceOver or, if they do, no guarantee that all elements of the UI will be labelled. However, this would then merely be a software fix, and I am confident developers would be willing to ensure their apps are as accessible as possible.
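To give a sense of why this is merely a software fix: tvOS apps are built on UIKit, where exposing an element to VoiceOver takes only a few lines. Below is a minimal, hypothetical sketch; the button, artwork name and label text are made-up examples, not taken from any real app:

```swift
import UIKit

// A minimal sketch of labelling an image-only control so VoiceOver can describe it.
// "poster-artwork" and the label text are hypothetical placeholders.
let playButton = UIButton(type: .custom)
playButton.setImage(UIImage(named: "poster-artwork"), for: .normal)

// Without a label, VoiceOver has nothing useful to read for an image-only button.
playButton.isAccessibilityElement = true
playButton.accessibilityLabel = "Play episode 3"
playButton.accessibilityTraits = .button
```

A few lines per control is all it takes, which is why I am hopeful developers will make the effort.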

There is also another exciting feature of the new Apple TV: universal voice search. This would reduce my need to interact with the UI significantly; now, if I would like to watch the latest episode of a show or a movie, I can just issue the command to Siri. It was also recently announced that this feature would roll out as an API, meaning apps such as Plex would have access.

This really does excite me. Instead of asking for help to find the DVD I would like to watch, then needing sighted assistance to select the correct audio track and start the film, I can do all of this myself. A simple voice command will give me independence in viewing media.

The new Apple TV will retain its much-loved spot, as it remains the most accessible media viewing device for the living room.

Learn to code in a day?

Learning to code in a day: the premise seems a little far-fetched, so I was certainly intrigued by the event at Decoded in London.

With the breadth of possibilities within coding so large, the focus of the day was on the specifics of creating an app that incorporated location data. Even this reduced focus seems like a mammoth task, especially considering the course is not aimed at people with previous coding experience. In fact, it is billed as helping newcomers obtain these skills in a day.

So with zero prior experience, is it possible to enable someone to create a location based app within a day? The quick answer is yes. Everyone on the course successfully created a location based app.

The day is broken into a few distinct learning methods: lectures, hands-on work and team tasks. These three different methods enable participants to gain a rounded knowledge of coding. The introductory lecture is a whirlwind tour of the beginnings of coding. I was disappointed that this didn’t feature Alan Turing, but it was a whirlwind tour after all! This lecture also introduced the technologies we would be utilising to create our app: HTML5, CSS and JavaScript.

We quickly moved over to the computers and began to create our app. This takes place within an online environment created by Decoded called Playto. The real power of Playto is in its instant feedback: the environment is broken down into columns, so as you type into the editor column, a live view column updates alongside. This means you are given instant feedback on what you are creating, which is an incredibly powerful learning tool, as you can instantly see your level of progress. It is also worth noting that anyone can utilise Playto, not just participants of the course.

As the day progressed we were introduced to HTML and CSS and began to build the look of our website, with functionality reserved for after lunch. The functionality of the website, being its location information, is accomplished through JavaScript and some backend tools that are beyond the scope of the single-day course. This element was, however, covered in another lecture, but it wasn’t something we created ourselves.

After lunch it was time to make our apps location aware. The premise was to make an app that would allow you to check in within a certain radius of a location; if you were outside of the specified radius, you would not be allowed to log in. This simple premise has a whole host of possibilities, and this was highlighted to me a few days later. A friend wanted to create an app that would have runners start at a set point, and every hour each runner would have to be beyond a particular radius; as the time increased, so did the radius. I realised that the app I created on the course could easily be adapted to serve this function.
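The core of that radius check is pleasingly small. As a flavour of the idea, here is a minimal sketch using CoreLocation on iOS (the course itself built this in JavaScript, and the coordinates and radius below are made-up example values):

```swift
import CoreLocation

// A minimal sketch of a radius-based check-in. The check-in point and
// allowed radius are hypothetical examples.
let checkInPoint = CLLocation(latitude: 51.5074, longitude: -0.1278)
let allowedRadius: CLLocationDistance = 500 // metres

func canCheckIn(from userLocation: CLLocation) -> Bool {
    // distance(from:) returns the separation between two points in metres.
    return userLocation.distance(from: checkInPoint) <= allowedRadius
}

// Example: a user roughly 70 metres from the check-in point.
let user = CLLocation(latitude: 51.5080, longitude: -0.1281)
print(canCheckIn(from: user)) // true
```

My friend’s running app would simply flip the comparison, requiring each runner to be beyond the radius rather than within it.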

To complete our coding task we were broken into two teams, with each team assigned a coding research task to complete the project. This was an interesting learning experience, as participants had the opportunity to communicate with team members in a way which may previously have been difficult and daunting. This is a fantastic skill that will transfer to the workplace and allow individuals to communicate with engineers and developers.

With the team tasks complete and everyone’s app functional, the day was done. I realised just how empowering it had been: in a single day, everyone on the course now had the skills and confidence to create something themselves and, importantly, the ability to communicate with the relevant teams in their workplace. Coding skills are rapidly being highlighted as essential, and perhaps courses like this should be too. They enable all team members to understand the process and language needed to communicate with development teams, which will truly become an essential skill as the workplace evolves.

For me personally, the course reinvigorated my interest in coding; I returned home and spent the next few days researching location services within iOS and playing with PHP. I look forward to where I will be in a few months’ time and how much my own coding will have improved. It also reminded me of how much I enjoyed my previous career in the educational sector; facilitating others to learn was the truly gratifying part of my job.

The questions…

My favourite part of public speaking is the Q&A section at the end. It’s interesting to be constantly challenged by people; I especially like questions that start with “I know it’s personal, but…”. These questions are usually challenging to answer and I do enjoy that. While it may sound scary to stand on stage while possibly thousands of people stare and wait for an answer, it always leads to interesting trains of thought.

Recently, at an event for PwC, I was asked the question “What is your biggest dream?”. Looking back now, with time to think about it, while I answered the question honestly, I didn’t feel I gave the justification as to why. It is, after all, the why that makes it interesting.

The question was “What is your dream?”. I responded with “to be VP of accessibility at a major tech company”, then went on to discuss my dreams within the realm of adventuring.

That doesn’t answer why I want to be VP at a major tech company. Well, that is because of a dream.

Access to information is essential for anyone’s advancement, from learning to simple day-to-day news gathering. For someone with sight loss it is immensely difficult. I can’t just pick up a book or a magazine, or even go to a website and read the latest information. Essentially, the majority of traditional forms of information are beyond what I am capable of accessing. This can make education incredibly difficult and places the visually impaired at a severe disadvantage.

While studying for my degree, this lack of access to information became incredibly apparent. While a facility did exist to make information accessible in an audio format, there was a substantial time delay, to the point where, if I stood any chance of finishing my degree, I would have to complete essays in 2–3 weeks. My dissertation was completed in 8. That was not by choice; it was a constraint introduced by access to information. I will quickly add, however, that I was 4 points off the highest possible mark!

But it is precisely this access to information I want to change. The mobile phone is truly a revolution in access to information, and that is where great change can take place. Android has the largest market share in terms of devices and could make an incredible global difference through accessibility. As the next billion people come online, imagine enabling a visually impaired person to access a book, the day’s news, or even a menu at a local restaurant for the first time. This is all possible by utilising screen reading technology and OCR through smartphone cameras.
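To sketch the mechanics of that last point: a phone camera captures an image, OCR turns it into text, and a screen reader speaks it. Here is a minimal, hedged illustration using Apple’s Vision framework on iOS; the same idea applies on Android with its own text recognition APIs:

```swift
import UIKit
import Vision

// A minimal sketch: recognise text in a photo and hand it to the screen reader.
func speakText(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Join the best candidate for each detected line of text.
        let text = observations
            .compactMap { $0.topCandidates(1).first?.string }
            .joined(separator: " ")
        // Ask VoiceOver to announce the recognised text.
        UIAccessibility.post(notification: .announcement, argument: text)
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

The pieces already exist; what is needed is for them to be woven into products everywhere.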

What is needed is rapid improvement in accessibility features, vast improvements in universal design and a concentrated focus on inclusive, user-centric design.

And it is precisely for all these reasons that I desire a senior role at a tech company: to help instigate that change and enable learning for all.

There is a place for this within other organizations too, and it was a topic of conversation with IBM. Any company that has a large consulting role has a wonderful opportunity: an opportunity to touch hundreds if not thousands of companies throughout the world, to initiate these changes and to work towards a more inclusive focus for technology and services.

So that’s the why, I just didn’t condense it on stage on the day!

An accessible oven?

Continuing my foray into the kitchen, I am amassing an even larger collection of specific kitchen gadgets. With the new diet commencing, I had a need for omelettes. In an attempt to be a little healthier I use more whites than yolks, so to aid in splitting the whites from the yolks I purchased an egg separator. It works surprisingly well and acts as a reminder: there is often a solution to a problem, you just have to look for it.


It is often these gadgets, created for very specific use cases, that enable me to function in the kitchen. While never envisaged as tools for the blind, their highly specialised function often makes them suitable for me.


I have found there are numerous gadgets that aid in the preparation of food, but not in the actual cooking. I feel this is because the oven, hob and microwave don’t receive much focus in terms of specific use cases and therefore do not see large functional improvements.


Well, at least that was what I thought until I heard about the June oven. Through an integrated high-definition camera, the June oven is able to identify what you are attempting to make and can suggest estimated cooking times. Right now it can identify chicken, steak, white fish, salmon, bacon, cookie dough, brownies, bagels, toast and burger buns. For a full breakdown of the oven’s capabilities, it’s worth checking out The Verge’s article.


The oven is also equipped with WiFi and a touch screen, and is able to live stream the cooking process. Along with its ability to estimate cooking times, it was the WiFi and touch screen that really stood out to me. With the oven having WiFi, it doesn’t seem a stretch of the imagination to be able to control it through a mobile app.


Imagine an oven I can control through an iPhone app: being able to set the temperature, have the oven identify that I want to cook a steak and have it suggest a cooking time!


This would literally be a game changer in the kitchen for me and would open up incredible possibilities in what I am able to cook easily and independently. Pairing it with other technology in the kitchen, I can see myself being able to create high-quality, healthy dishes for the first time in my life.


So June, I am here, I will be your beta tester. Let’s make an oven that can transform people’s lives.

Now anyone can bake?

Over the past few weeks I have become interested in advancing my baking and cookery skills. This introduces a number of obstacles as a blind individual, mainly that there are a lot of tasks with the potential to hurt you!

I have begun to break down these tasks and will be covering them in a series of posts. For today, though, I would like to focus on weighing.

This is a surprisingly difficult task, from measuring out liquids to weighing items for baking and cooking. There are a few talking kitchen scales out there but, as ever with products for the visually impaired, they are grossly overpriced for their limited and often lacklustre feature set.

So I was incredibly excited when I found the Drop scales, especially with their slogan “Now anyone can bake”. I certainly fit into the anyone category, so I popped down to the local Apple Store and made a purchase with the idea of testing their accessibility. The Drop scale connects over Bluetooth to an iPad and displays the weight on screen. It also has a large array of features that walk you through baking and cooking specific items, as well as features such as auto-scaling the weights in recipes.

I thought this could be the perfect item for me: a feature-rich set of scales that would display the weight of an item on screen. VoiceOver could read the weight to me and these scales would solve a large kitchen problem.

Upon testing the app, VoiceOver worked surprisingly well: a large number of the features can be read aloud and the buttons are labelled well. The problem came when I tested the scales’ core feature, weighing. The current weight is not a VoiceOver-selectable item and therefore cannot be read aloud.
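For what it’s worth, the fix is likely small. As a minimal, hypothetical sketch (I have no sight of Drop’s actual code; the class and property names below are made up), exposing a live weight readout to VoiceOver looks something like this:

```swift
import UIKit

// A hypothetical sketch of exposing a custom-drawn weight readout to VoiceOver.
final class WeightView: UIView {
    var grams: Int = 0 {
        didSet {
            // Keep the spoken value in sync with what is drawn on screen.
            accessibilityValue = "\(grams) grams"
        }
    }

    override init(frame: CGRect) {
        super.init(frame: frame)
        // Make the readout visible to VoiceOver at all.
        isAccessibilityElement = true
        accessibilityLabel = "Current weight"
        // Hint that this element changes often, so VoiceOver re-reads it.
        accessibilityTraits = .updatesFrequently
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}
```

A handful of lines like these would have made the core feature usable for me.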

It is worth highlighting that if you have low vision these scales will work well: the current weight is displayed in a white font on a black background. It is very high contrast and far superior to the small screens that usually accompany kitchen scales.

Not deterred by the scales’ current lack of VoiceOver support, I emailed Drop, putting in a request for the current weight to be selectable by VoiceOver. I unfortunately received a boilerplate response saying it was something they may investigate in the future and thanking me for my patience.

This disappointed me more than the scales not working for me: the company’s lack of insight into an opportunity. The Drop scales are at price parity with other accessible scales but are far more feature-rich. Therefore, if they were accessible, they could easily take a large chunk out of that market.

There is also the additional business case of the positive marketing they would receive from making this change. It would certainly bring them attention from the VI media as well as the mainstream media.

The business case for this change appears to make sense, and that is what is disappointing. As ever, making something accessible is way down the priority list, mainly because the company fails to see the positive impact such a change could have.

It would make a huge impact on individuals like myself, for whom it would solve a real problem, but it would also make an impact on their bottom line. The development cost of the change would easily be outweighed by the new market the scales would be opened to, and by the press coverage. Companies need to stop seeing making a product or service accessible as a low priority and understand the positive business case for making the change.


Then perhaps the slogan “Now anyone can bake” would hold true.


Reading a book to my children

A wonderful article over at Broadsheet.com about Nas Campanella, a blind newsreader:

Her studio is equipped with strategically placed Velcro patches – she operates her own panel – so she can recognise which buttons to push to air news grabs and mute or activate her mic. While she’s reading on air, that same electronic voice reads her copy down her headphones which she repeats a nanosecond later. In another ear the talking clock lets her know how much time she has left. The sound of her own voice is audible over the top of it all.

This reminded me of a problem in my own life: reading books to my children. I have often thought about using a tiny in-ear wireless headphone, such as the Earin, to solve this problem, with a synthetic voice feeding me the text to repeat aloud. It’s interesting to hear someone is using this approach on a daily basis in their work life. The article is also well worth a read, as Nas’s attitude is remarkable.

Dream to Reality

A few years ago I began to think of a few adventures I would love to embark on. I came up with three: The Pilgrimage, The Return and The Dream. Late last month I was fortunate enough for The Pilgrimage to become a reality.

The basic premise of The Pilgrimage was to pay homage to RunKeeper and visit a city close to my heart: NYC. The dream was to run from the RunKeeper HQ in Boston to NYC, then compete in the NYC Marathon. The idea of visiting the RunKeeper HQ was to thank them for where I am today; their app enabled me to believe running solo was possible. The reason for NYC? I spent a bit of time there while I could still see, and the city remains close to my heart.

The adventure was made possible by a few select companies, namely Twitter, PayPal and Airbnb. Little did I know that partnering with Airbnb would elevate the adventure so greatly.

I have decided to break the details of the adventure up into a little series of moments. Rather than detailing the adventure chronologically, I will highlight the memories that were forged and hopefully paint a picture of how I will remember the adventure.

It is worth noting at this point how great all the companies, hosts and especially my crew were in making this a reality. Even now, two weeks after my return, the experiences are difficult to comprehend. It became more than a run, and far more than the pilgrimage I had intended it to be.