Is it possible?

Nearly 6 years ago I began a journey. I perhaps didn’t realize it at the time, but it was one that would push the boundaries of possibility.
Learning to run solo as a blind runner was at times truly challenging and at others phenomenally uplifting; it was a long and difficult journey. Back then I had vowed never to compete, as I wasn’t particularly interested in competition. However, as the idea of seeing how far I could run began to play on my mind, I decided the only way to find out would be to compete at the ultra distance.
It was during this time, when I had entered a race and rather quickly lost my guiding team, that I came up with an idea: if I couldn’t run 100 miles with a guide, perhaps I could do it alone. I had planned on running my standard route repeatedly and then running the entire night segment on a 0.3 mile loop. Thankfully, I found a guide team at the last minute and didn’t subject myself to a night’s torment on that tiny loop.
But it planted the seed of competing alone, something I hadn’t believed to be possible. Then again, neither was training alone as a blind runner. Surely all I needed was the right race.
Well, I found it: 160 miles through the Namibian desert. What had once been a dream is now a reality. In 8 days I will stand alone at a start line for the first time, with 160 miles of possibility ahead of me.
Is it possible to cover this distance alone? The fact that a question still lurks around this is what attracts me. I don’t know if this is possible. I have been spending time with the IBM Bluemix Garage to develop the technology to give me the opportunity. But there remains the same question that always popped up during my original ultra training.
What will break first, the technology or me?
Neither has ever been tested in the desert; neither has ever had to survive there for 7 days. I head out once again alone, in synergy with technology, to advance the line of possibility.
I am thankful I have this opportunity and intend to push to my absolute limit to achieve something wonderful. 

Boston – I just can’t stay away

The city of Boston has come to mean so much to me over the past few years. It is the home of RunKeeper and therefore served as the jumping-off point for my Boston to NYC running adventure. Starting the adventure from the offices of RunKeeper and having the opportunity to run with the team that enabled my running achievements was a great moment.

The first day of running served as my introduction to the Boston Marathon course. From the offices we quickly headed to the official finish line for an obligatory photo, as starting in the city meant the course would be run in reverse! I can safely say the course is far easier in reverse; after all, Heartbreak Hill becomes a breeze!

But it wasn’t until I ran the NYC marathon that the idea of running the Boston Marathon entered my mind. It was pointed out by Andrea of Team with a Vision that, after running from Boston to NYC and then completing the NYC marathon, I was only 13 minutes shy of qualifying for the Boston Marathon.

Coming that close to qualifying made me determined to run a little faster and get to Boston. Fortunately, there was one race left in which to qualify, and it happened to fall on the final day of qualification. Lucky me! Except I hadn’t run in 2 months after injuring myself running from Boston to NYC. So, in what I still consider the most difficult race of my life, I set off to qualify for the Boston Marathon. With a large amount of arguing with my guide runner and throwing our toys out of the pram, we managed to pull it together and shave 15 minutes off our time, putting me 2 minutes inside the qualifying standard.

So in 2015 I headed out to run my first ever Boston Marathon with Team with a Vision. It was a wonderful race, with incredible crowd support that pulls you along. And yes, this is when I discovered Heartbreak Hill is easier in reverse!

I also managed to complete the race in a qualifying time for the 2016 race. So this year I will once again be heading out to Boston to run the streets and hopefully pick up a marathon PB.

For the first time I will also be running with a new guide runner: while Neil Bacon is heading out to Boston, he is guiding a fellow blind competitor. I have the wonderful pleasure of running alongside Heather Armstrong of dooce and Nicole of Pumps & Iron. I have a feeling they may run circles around me. I am built for cookie distance; I can grind it out but I am not fast, and you can’t have cookie breaks when you run fast!

We are all running together to raise money for Team with a Vision, a fantastic cause that helps the visually impaired within Massachusetts. So if you’d like to help me raise money for the Massachusetts Association for the Blind and Visually Impaired, head on over to my fundraising page at CrowdRise and donate!

Raspberry Pi 2 – Improving accessibility

Technology truly is redefining what is possible for the disabled community. However, there always remains a barrier – cost. Often products designed for assistive purposes carry a substantial price tag. There are a number of reasons for this and I firmly believe the majority of these issues can be overcome through universal design.
For example, design a product with a universal approach and it gains the possibility of longevity, improved functionality for all, and scale. The great benefit of this is a reduced price tag, making it affordable and overcoming that greatest barrier of all – cost.
This made me assess which piece of technology I own I would consider the most functional from a price perspective. After thinking about it for a while, the answer surprised even me: it is my Raspberry Pi 2. Including the Pi, a case, a power supply and an SD card for storage, the cost was around £50. This is incredibly low cost, but what exactly does it offer me?
After all, it isn’t the most accessible of products, as I run it headless and control it through a command line interface. But it just sits there, stuffed down the back of my television, quietly integrating all my other technology.
It serves FLAC audio to my Sonos Play:5, a wonderfully accessible music player; it serves up my audio described content to my iPhone, iPad and Apple TV; and it also serves as file storage for my iPad Pro. It truly does allow all my other accessible technology to work seamlessly. It is this little device that allows everything else to function.
Therefore, for its price point I class it as the most functional device in my house. It certainly couldn’t replace my iPhone or iPad, but those devices would not serve me so well without that cheap little Raspberry Pi.
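For the technically curious, here is a minimal sketch in Python of the kind of job the Pi quietly performs: indexing an audio library and serving it over the local network. The media path, file extensions and plain-HTTP approach are illustrative assumptions only, not my actual configuration.

```python
# Sketch of a headless media index + server on a Raspberry Pi.
# MEDIA_ROOT is a hypothetical location; a real setup might use
# Samba or a DLNA server instead of this toy HTTP server.
from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer
from pathlib import Path

MEDIA_ROOT = Path.home() / "media"  # hypothetical library location

def index_media(root, exts=(".flac", ".m4a", ".mp3")):
    """Return a sorted list of audio files found under root."""
    root = Path(root)
    if not root.exists():
        return []
    return sorted(p for p in root.rglob("*") if p.suffix.lower() in exts)

def serve(root, port=8000):
    """Serve the media directory over plain HTTP on the local network."""
    def handler(*args, **kwargs):
        return SimpleHTTPRequestHandler(*args, directory=str(root), **kwargs)
    ThreadingHTTPServer(("0.0.0.0", port), handler).serve_forever()

if __name__ == "__main__":
    print(f"{len(index_media(MEDIA_ROOT))} tracks indexed under {MEDIA_ROOT}")
    # serve(MEDIA_ROOT)  # uncomment to expose the library on the LAN
```

Anything that can speak HTTP, from a phone to a screen-reader-friendly file app, can then browse the library.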

IBM & CMU assisting in mobility

Mobility for the visually impaired is always difficult, from simple tasks such as heading to Starbucks for a coffee to jumping on a bus or grabbing a taxi. Let’s take the first example: heading to Starbucks is certainly challenging when you are unable to see, but what about when you enter the store? Without sighted assistance, locating the counter or indeed finding somewhere to sit is challenging.

Therefore, any technology that aims to improve any of these mobility issues is a step in the right direction. At the risk of this blog turning into IBM fandom, here is yet another project IBM are working on.

Along with Carnegie Mellon University, IBM have developed and open-sourced a smartphone application that can help you move from point A to point B.

The app, called NavCog, utilises either voice or haptic feedback to aid navigation. It currently uses beacons to assist in the navigation process.
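To sketch the general idea behind beacon-based navigation (this is not NavCog’s actual algorithm, just the textbook approach): a beacon advertises its expected signal strength at 1 metre, the receiver estimates distance from the signal strength it actually measures, and that distance can drive haptic pulses.

```python
# Illustrative beacon proximity cueing; constants are typical example
# values, not taken from NavCog.

def estimate_distance(rssi, measured_power=-59, path_loss_exponent=2.0):
    """Rough distance in metres from an RSSI reading.

    Uses the standard log-distance path-loss model: measured_power is
    the advertised RSSI at 1 m, path_loss_exponent ~2 in open space.
    """
    return 10 ** ((measured_power - rssi) / (10 * path_loss_exponent))

def haptic_interval(distance_m):
    """Map distance to a pulse interval in seconds: closer means faster pulses."""
    return max(0.1, min(2.0, distance_m / 5))

# Reading the advertised power exactly means you are about 1 m away.
print(estimate_distance(-59), haptic_interval(estimate_distance(-59)))
```

The log-distance model is very rough indoors, which is partly why real systems fuse readings from many beacons rather than trusting a single estimate.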

It is great to see the combination of beacons and haptic feedback used to aid navigation. Over 4 years ago I was pitching to just about every GPS manufacturer that this could be an interesting direction to head in. My ideas seemed sound when Apple announced the Apple Watch and it used the exact same haptic feedback system I had been proposing. Furthermore, the use of beacon technology to navigate is exactly what I pitched to British Airways a couple of years ago.

I proposed that beacons used to navigate Terminal 5 could not only direct potential customers to shops, restaurants and gates, but also aid visually impaired customers in navigating the terminal.

It is truly great to see all these ideas put together and finally implemented. We now just need a larger rollout of beacon technology!

This system could also be adapted to solve the internal navigation problem. I was speaking with Google a year or so ago about how Project Tango could be utilised to achieve this. I imagined a haptic feedback device that could assist in real-time internal navigation. After all, my guide dog may be able to avoid obstacles, but an empty chair is just an obstacle to my guide dog!

Artificial Intelligence and accessibility

Over the past couple of weeks I have been fortunate enough to be exposed to some fantastic technology as well as ideas. Attending WIRED 2015 kickstarted my thought process on how artificial intelligence could be applied to accessible technology.

While attending the conference there were two ideas I wanted to pitch to people: emotion detection to facilitate social situations for the visually impaired, and facial recognition. I felt both these technologies could greatly improve an individual’s ability to socialise. After chatting to a few people and pitching my ideas on how these systems could work from a design, implementation and marketing front, I managed to interest a few companies and institutions.

There is fantastic scope for these technologies and their assistive ability. I concentrated on the emotion detection system initially, as I feel it could have the greatest and speediest impact. I have encapsulated the idea into a product for all, rather than a product specifically for the visually impaired, as I believe this is key for mass-market adoption, which in turn will significantly reduce the price and lower that initial barrier to any accessible product: cost.

I am yet to find a partner to work with on facial recognition, but I recently read an article highlighting that IBM are working on this. It really does seem, as time goes on, that IBM and I could be a great match!

I also had a grander idea on accessibility while at the conference, and was delighted to see it referenced, yet again, by IBM: cognitive assistance. I have been batting around a few ideas on how accessibility could be personalised. After all, there are nuances in an individual’s accessibility needs, so why not make the solutions just as nuanced? This could definitely be achieved through a cognitive accessibility assistant that has the capacity to learn.

An accessible system that is capable of learning could aid in such tasks as reading. It would be able to identify how an individual likes to read information and execute it in that fashion. A nice example would be skim reading: being able to learn how to read a specific document for certain contextual references would be fantastic. This would certainly have assisted me greatly while at university; the ability to skim read is absolutely a skill I miss.

I continue to be excited by what technology is enabling and how I can become part of the revolution of accessibility.

TICKR X & The 7 Minute Workout Accessibility

I am always looking for simple and effective ways to make workouts more accessible. It can often be difficult to monitor and track workouts, so I was very excited when Wahoo sent along the TICKR X. The TICKR X is a very capable device that can track a whole multitude of stats, from HR to body movement and more, but the data point I was most interested in was rep counting.

Utilising the TICKR X along with the Wahoo 7 minute workout app on my iPhone, all my reps could be counted automatically. No more writing it all down in an app afterwards, assuming I could remember how many reps I performed of each exercise.

What is the 7 minute workout?

The 7 minute workout is a collection of 12 body weight exercises that you can complete anywhere, with no equipment needed. It has been shown to give results comparable to longer running or weights sessions. It comprises 12 work sets of 30 seconds, each followed by a 10 second rest. It is a quick and highly effective HIIT workout.
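The structure above can be written down as a simple interval schedule; the exercise names here are placeholders, not the app’s actual list.

```python
# The 7 minute workout structure as data: 12 work sets of 30 s, each
# followed by a 10 s rest. Exercise names are hypothetical placeholders.
EXERCISES = [f"exercise {i + 1}" for i in range(12)]
WORK_S, REST_S = 30, 10

def build_schedule(exercises, work=WORK_S, rest=REST_S):
    """Return (label, seconds) segments: a work set then a rest, per exercise."""
    schedule = []
    for name in exercises:
        schedule.append((name, work))
        schedule.append(("rest", rest))
    return schedule

total = sum(seconds for _, seconds in build_schedule(EXERCISES))
print(f"total workout time: {total} s")  # 12 * (30 + 10) = 480 s
```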

The Device

The device itself is one of those rare devices a blind user can take out of the box, configure and use without sighted assistance. You click the device onto one side of the strap, wrap it around your chest and clip it into the other side, ensuring the TICKR X is positioned in the middle of your chest. Do not worry if the device doesn’t touch your skin: unlike other HR trackers, the HR sensors are located in the strap, not in the TICKR X device itself.

To turn the device on, tap the TICKR X a couple of times; you are then ready to pair it with your phone, which is done inside the app.

The App

The 7 minute workout app is highly accessible; Wahoo have done a fantastic job of labelling all elements appropriately. It is simple to navigate the app and start an activity, and as the reps are counted automatically, that is essentially all you have to do. Start a workout and read your results: no manual inputting, it’s all taken care of.

The app also uses a lot of audio for feedback. For example, the different exercises and start and rest sections are read aloud. It makes for a nice accessible experience.

How Does it Perform

When I began my workout I was surprised it worked; it was a real wow moment as I heard the reps count up while I completed the workout. I was quickly put in my place on correct form, as the TICKR X wouldn’t count reps with bad form. No longer can I cheat and just do quick reps with poor form; I am now forced to do lower reps with correct form. While this affects my rep count, it does mean I am actually performing the exercise correctly! This was evident in push up rotations, as the TICKR X wouldn’t count a rep if I didn’t perform the appropriate amount of rotation, which is common when going for speed. This correction of form isn’t limited to simply not counting a rep: when in the plank position, if you begin to wane, the app notifies you to watch your form!

The other slightly confusing part of the counting system is in exercises that have an up and a down movement, for example push ups, triceps dips and so on. This is because the TICKR X counts the up and the down as 2 reps, whereas typically I would count each up and down as one rep. In itself this isn’t actually an issue, as it correctly counts the movements and you are able to compare your results and see improvement, since the counting is consistent. If anything, it just makes it look like you can do twice the number of push ups you used to be able to!
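If you did want your logs in conventional reps, the conversion is trivial; the exercise set here is illustrative, not how Wahoo classify movements.

```python
# Normalise device counts to conventional reps. The TICKR X counts each
# up AND down phase, so two-phase exercises report double the usual
# number. TWO_PHASE is a hypothetical classification for illustration.
TWO_PHASE = {"push up", "triceps dip"}

def conventional_reps(exercise, device_count):
    """Halve the device count for exercises with an up and a down phase."""
    if exercise in TWO_PHASE:
        return device_count // 2
    return device_count
```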

For me the real wow moment of the app was upon completing a full workout. When the workout is finished, you are given a table with the rep results for each exercise along with the HR for each exercise. This granularity was a fantastic reporting decision. Typically, average HR is used for workouts, whereas Wahoo have chosen to give you the HR breakdown for each exercise. This is important, as you are quickly able to identify the exercises you should be pushing harder on. I was able to see that I was sandbagging it a little on what I consider to be the easier exercises; perhaps unconsciously I was using them as a little rest period.

Like the other sections of the app, the reports are very accessible. Wahoo really have done a great job in regards to accessibility for the blind.

Any Bad Points?

Yes, the in-app purchases. I was a little disappointed that a premium-priced piece of hardware like the TICKR X required multiple small in-app purchases to unlock full app functionality. It must be noted that this is for the 7 minute workout challenge; there are other apps from Wahoo that work with the TICKR X, but I was focussed on utilising the TICKR X for rep counting.

Overall

The TICKR X is wonderfully accessible with the accompanying 7 minute workout challenge app, and I would highly recommend it for a blind user. Its rep counting keeps you honest, and the reports highlight where you should be pushing harder. It is a definite buy for any blind user looking for an accessible and quick cardio workout!

IBM Serendipity

Two years ago, in the middle of my degree, I went to meet with IBM HR. The idea was to have a chat with them about my vision of an inclusive and accessible world through technology.

IBM stand at a fantastic point within the technology sector, where they have the ability to touch a huge number of organizations in wildly different fields. It was this very point that made me think IBM and I could be a perfect match.

There is a need for all technology to be inclusively designed, to enable everyone to have universal access – from mobile devices, to the internet of things, to access to transport. Indeed, it was IBM’s Smarter Planet initiative that made me believe there was a way to make the world accessible through the advancement of new technologies.

I pitched to HR that I would be a wonderful fit as an accessibility evangelist, working with all manner of partners and focussing on how technology could be made inclusive: from advising on human interface interactions that have not only visual elements but auditory and haptic ones too, to communicating complex information in new and interesting ways. I continued by highlighting that the opportunity to interact with clients at an early stage would aid a universal design approach across all technology.

Indeed, it is this early-stage approach that has brought me great success with Kickstarter. I often find projects in their very early stages and communicate with the team on how minor adjustments could improve accessibility, be it the addition of audible tones or changing a UI to take a blind user into account. I have also had great success with Fitbit and Drop scales: with both companies I advised on how to communicate information in different forms to increase accessibility. The added benefit of this change was a greater understanding by all users, not just those who cannot see.

I imagine a world where, as the next 1 billion people and 10 billion devices come online, there is no barrier to interaction, because these products and services have taken a universal approach from the beginning. It is also worth highlighting that this approach can create benefits for all users, not just those who rely on accessibility. For example, a low vision user may be aided by contrasting or night mode colour themes; these exact features also assist any individual using the device at night. The route to a truly intuitive and simple design can also be found by taking on the needs of a blind user: if you can make a user interface or product that a blind user can utilise, it truly is simple and intuitive.

It was during this conversation that I highlighted how important this approach is to all services and products. There should never be an assumption that a particular product or service will not be utilised by a particular demographic. To highlight this, I mentioned how I had utilised RunKeeper to learn to run solo outdoors. It would have been easy for RunKeeper to assume a blind person would not use their app; after all, what use would it be to a blind person? But thankfully they made no such assumption, and I was able to achieve what was once perceived impossible: learning to run solo outdoors.

I continued by saying that this is why I wanted to work with IBM: I wanted to make sure every service and every product across all sectors became accessible. Just imagine the impact this could achieve with the number of partners and clients IBM work with. With accessibility an assumed standard across the board, just imagine the impossible things that could be achieved in the next few years.

During the rest of the conversation, IBM HR mentioned they could imagine me starring in an IBM commercial, demonstrating what accessible technology can enable people to do. Well, fast forward 2 years and that opportunity arrived. IBM gave me a call and asked if I would like to be featured in a little video. I of course said yes, and the result is the video below.

In those past 2 years I have continued to try to make the world a more accessible place, through advocating for universal design, working with many tech firms, and countless public speaking appearances at large tech events. But I still feel I could do so much more; there is still a need for that evangelist role, and I am still a great fit. There is a real need to ensure universal design across the board. When that goal is achieved, countless people will be enabled to achieve the impossible.

Looking forward to Apple TV OS

My favorite piece of technology in the living room is my Apple TV, and it is about to see a significant update. I love the Apple TV for two reasons: VoiceOver and Netflix. VoiceOver is fantastic at assisting in navigating the UI, as it reads aloud all the elements, and Netflix has fantastic audio described content.

However, it is limiting. I only access my media through Netflix, but I have a world of other media. I have numerous DVD and Blu-ray discs, all with great audio described content. The problem is how I access this media. For example, identifying the discs and navigating the menus are both challenging and require sighted assistance. There just isn’t a great accessible removable media device.

So the current solution is to rip these discs along with their audio described content and AirPlay them to my Apple TV. This allows me to use a screen reader to select the content I would like to listen to. But it shouldn’t be this hard, and I hope the new Apple TV can help in this respect.

For the first time the Apple TV has an SDK, meaning developers can create apps for the system. This brings with it the opportunity to access my other media through an accessible UI. This isn’t just hypothetical either, as Plex have already announced their intention to release on the platform.

There is, however, one caveat: opening up apps on the Apple TV to an SDK, instead of having them created under the strict UI guidelines of the past, gives developers free rein. With free rein may come the possibility of apps no longer supporting VoiceOver, or, if they do, no guarantee that all elements of the UI will be labelled. However, this would then merely be a software fix, and I am confident developers would be willing to ensure their apps are as accessible as possible.

There is also another exciting feature of the new Apple TV – Universal Voice Search. This would significantly reduce my need to interact with the UI: if I would like to watch the latest episode of a show or a movie, I can just issue the command to Siri. It was also recently announced that this feature would roll out as an API, meaning apps such as Plex would have access.

This really does excite me: instead of asking for help to find the DVD I would like to watch, then needing sighted assistance to select the correct audio track and start the film, I can do all of this myself. A simple voice command will allow me independence in viewing media.

The new Apple TV will retain its much loved spot, as it remains the most accessible media viewing device for the living room.

Learn to code in a day?

Learning to code in a day: the premise seems a little far-fetched, so I was certainly intrigued by the event at Decoded in London.

With the breadth of possibilities within coding so large, the focus of the day was on the specifics of creating an app that incorporated location data. Even this reduced focus seems like a mammoth task, especially considering the course is not aimed at people with previous coding experience. In fact, it is billed as helping newcomers obtain these skills in a day.

So with zero prior experience, is it possible to enable someone to create a location based app within a day? The quick answer is yes. Everyone on the course successfully created a location based app.

The day is broken into a few distinct learning methods: lectures, hands-on work and team tasks. These three methods enable participants to gain a rounded knowledge of coding. The introductory lecture is a whirlwind tour of the beginnings of coding – I was disappointed that this didn’t feature Alan Turing, but it was a whirlwind tour after all! This lecture also introduced the technologies we would be utilising to create our app: HTML5, CSS and JavaScript.

We quickly moved over to the computers and began to create our app. This takes place within an online environment created by Decoded called Playto. The real power of Playto is its instant feedback: the environment is broken down into columns, and as you type into the editor column, a live view column updates. This means you are given instant feedback on what you are creating, which is an incredibly powerful learning tool, as you can instantly see your level of progress. It is also worth noting that anyone can utilise Playto, not just participants of the course.

As the day progressed we were introduced to HTML and CSS and began to build the look of our website, with functionality reserved for after lunch. The functionality of the website, being its location awareness, is accomplished through JavaScript and some backend tools that are beyond the scope of the single day course. This element was covered in another lecture, but it wasn’t something we created ourselves.

After lunch it was time to make our apps location aware. The premise was to build an app that would allow you to check in within a certain radius of a location; if you were outside the specified radius, you would not be allowed to check in. This simple premise has a whole host of possibilities, and this was highlighted to me a few days later. A friend wanted to create an app that would have runners start at a set point, and every hour each runner would have to be beyond a particular radius; as the time increased, so did the radius. I realized that the app I created on coding in a day could easily be adapted to serve this function.
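The course itself used JavaScript, but the check-in logic is easy to sketch in any language. Here it is in Python, with example coordinates, using the standard haversine formula for the distance between two points:

```python
# Radius-based check-in, the core idea of the app built on the course.
# Coordinates below are arbitrary examples, not from the course.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    R = 6_371_000  # mean Earth radius in metres
    p1, p2 = radians(lat1), radians(lat2)
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(p1) * cos(p2) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

def can_check_in(user, target, radius_m):
    """True if the user's (lat, lon) is within radius_m of the target."""
    return haversine_m(*user, *target) <= radius_m

# A phone reporting its position ~11 m from the target, with a 50 m radius:
print(can_check_in((51.5001, -0.1), (51.5, -0.1), 50))
```

My friend’s running challenge is just the test flipped: require the distance to be *greater* than a radius that grows every hour.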

To complete our coding task we were broken into two teams, with each team assigned a coding research task to complete the project. This was an interesting learning experience, as participants had the opportunity to communicate with team members in a way which previously may have been difficult and daunting. This is a fantastic skill that will transfer to the workplace and allow individuals to communicate with engineers and developers.

With the team tasks complete and everyone’s app functional, coding in a day was complete. I realized just how empowering the day had been: in a single day, everyone on the course had gained the skills and confidence to create something themselves and, importantly, the ability to communicate with the relevant teams in their workplace. Coding skills are rapidly being highlighted as essential, and perhaps so should courses like coding in a day. They have the ability to enable all team members to understand the process and language needed to communicate with development teams, which will truly become an essential skill as the workplace evolves.

The course personally reinvigorated my interest in coding; I returned home and spent the next few days researching location services within iOS and playing with PHP. I look forward to where I will be in a few months’ time and how much my own coding will have improved. It also reminded me of how much I enjoyed my previous career in the education sector: facilitating others to learn was the truly gratifying part of my job.

The questions…

My favorite part of public speaking is the Q&A section at the end. It’s interesting to be challenged by people; I especially like questions that start with “I know it’s personal but…”. These questions are usually challenging to answer, and I do enjoy that. While it may sound scary to stand on stage while possibly thousands of people stare and wait for an answer, it always leads to interesting trains of thought.

Recently, at an event for PwC, I was asked the question “What is your biggest dream?”. Looking back, and with time to think about it, while I answered the question honestly, I didn’t feel I gave the justification as to why. It is, after all, the why that makes it interesting.

The question was “What is your dream?”. I responded with “to be VP of accessibility at a major tech company”, then went on to discuss my dreams within the realm of adventuring.

This doesn’t answer why I want to be VP at a major tech company. Well, that is because of a dream.

Access to information is essential for anyone’s advancement, from learning to simple day-to-day news gathering. For someone with sight loss, that access is immensely difficult. I can’t just pick up a book or magazine, or even go to a website, and read the latest information. Essentially, the majority of traditional forms of information are beyond what I am capable of accessing. This can make education incredibly difficult and place the visually impaired at a severe disadvantage.

While studying for my degree, this lack of access to information became incredibly apparent. While a facility did exist to make information accessible – in an audio format – there was a substantial time delay, to the point where, if I stood any chance of finishing my degree, I would have to complete essays in 2-3 weeks. My dissertation was completed in 8. That was not by choice; that was a constraint introduced by access to information. I will quickly add that I was only 4 points off the highest possible mark, however!

But it is precisely this access to information I want to change. The mobile phone is truly a revolution in access to information, and that is where great change can take place. Android has the largest market share in terms of devices and could make an incredible global difference through accessibility. As the next billion people come online, imagine enabling a visually impaired person to access a book, the day’s news, or even a menu at a local restaurant for the first time. This is all possible by utilising screen reading technology and OCR through smartphone cameras.

What is needed is rapid improvement in accessibility features, vast improvements in universal design and a focussed concentration on inclusive user centric design.

And it is precisely for all these reasons that I desire a senior role at a tech company: to help instigate that change and enable learning for all.

There is a place for this within other organizations too, and it was a topic of conversation with IBM. Any company that has a large consulting role has a wonderful opportunity: an opportunity to touch hundreds, if not thousands, of companies throughout the world, to initiate these changes and work towards a more inclusive focus for technology and services.

So that’s the why; I just didn’t condense it on stage on the day!