TICKR X & The 7 Minute Workout Accessibility

I am always looking for simple and effective ways to make workouts more accessible. It can often be difficult to monitor and track workouts, so I was very excited when Wahoo sent along the TICKR X. The TICKR X is a very capable device that can track a multitude of stats, from heart rate to body movement and more, but the data point I was most interested in was rep counting.

With the TICKR X paired to the Wahoo 7 minute workout app on my iPhone, all my reps could be counted automatically. No more writing it all down in an app afterwards, assuming I could remember how many reps I had performed on each exercise.

What is the 7 minute workout?

The 7 minute workout is a collection of 12 body weight exercises that you can complete anywhere, with no equipment needed. It has been shown to give results comparable to longer running or weights sessions. It comprises 12 work sets of 30 seconds each, each followed by a 10 second rest. It is a quick and highly effective HIIT workout.

The Device

The device itself is one of those rare devices a blind user can take out of the box, configure and use without sighted assistance. You click the device onto one side of the strap, wrap it around your chest and clip it into the other side of the strap. Ensure the TICKR X is positioned in the middle of your chest, and do not worry if the device doesn’t touch your skin. Unlike other HR trackers, the HR sensors are located in the strap, not the actual TICKR X device.

To turn the device on tap the TICKR X a couple of times and then you are ready to pair it with your phone. This is achieved inside the app.

The App

The 7 minute workout app is highly accessible; Wahoo have done a fantastic job of labelling all elements appropriately. It is simple to navigate the app and start an activity, and as the reps are counted automatically, that is essentially all you have to do to use the app. Start a workout and read your results; no manual inputting, it’s all taken care of.

The app also uses a lot of audio for feedback. For example, the different exercises and start and rest sections are read aloud. It makes for a nice accessible experience.

How Does it Perform?

When I began my workout I was surprised it worked; it was a real wow moment as I heard the reps count up as I went about completing the workout. I was quickly put in my place on correct form, as the TICKR X wouldn’t count reps with bad form. So no longer can I cheat and just do quick reps with poor form; I am now forced to go lower rep with correct form. While this affects my rep count, it does mean I am actually performing the exercise correctly! This was evident in push up rotations, as the TICKR X wouldn’t count a rep if I didn’t perform the appropriate amount of rotation, which is common when going for speed. This correction of form isn’t limited to simply not counting a rep: when in plank position, if you begin to wane the app notifies you to watch your form!

The one slightly confusing part of the counting system is in exercises that have an up and down movement, for example push ups, triceps dips and so on. This is because the TICKR X counts the up and the down as 2 reps, whereas typically I would count each up and down as one rep. In itself this isn’t actually an issue, as it correctly counts the movements and you are able to compare your results and see improvement because the counting is consistent. If anything it just makes it look like you can do twice the number of push ups you used to be able to!

For me the real wow moment of the app came upon completing a full workout. When the workout is finished you are given a table with the rep results for each exercise along with the HR for each exercise. This granularity was a fantastic reporting decision. Typically an average HR is given for a whole workout, whereas Wahoo have chosen to give you the HR breakdown of each exercise. This is important, as you are quickly able to identify the exercises you should be pushing harder on. I was able to see that I was sandbagging it a little on what I consider to be the easier exercises; perhaps unconsciously I was using them as a little rest period.

Like the other sections of the app the reports are very accessible; Wahoo really have done a great job with regards to accessibility for the blind.

Any Bad Points?

Yes, the in app purchases. I was a little disappointed that a premium priced piece of hardware like the TICKR X required multiple small in app purchases to get full app functionality. It must be noted that this is for the 7 minute workout challenge; there are other apps from Wahoo that work with the TICKR X, but I was focussed on utilising the TICKR X for rep counting.

Overall

The TICKR X is wonderfully accessible with the accompanying 7 minute workout challenge app. I would highly recommend it for a blind user. Its rep counting keeps you honest and the reports allow you to highlight where you should be pushing harder. It is a definite buy for any blind user looking for an accessible and quick cardio workout!

IBM Serendipity

Two years ago, in the middle of my degree, I went to meet with IBM HR. The idea was to have a chat with them about my vision of an inclusive and accessible world through technology.

IBM stand at a fantastic point within the technology sector, where they have the ability to touch a huge number of organizations in wildly different fields. It was this very point that made me think IBM and I could be a perfect match.

There is a need for all technology to be inclusively designed, to enable everyone to have universal access: from mobile devices, to the internet of things, to access to transport. Indeed it was IBM’s Smarter Planet initiative that made me believe there was a way to make the world accessible through the advancement of new technologies.

I pitched to HR that I would be a wonderful fit for an accessibility evangelist role, working with all manner of partners and focussing on how technology could be made inclusive. From advising on human interface interactions that have not only visual elements but auditory and haptic ones too, to communicating complex information in new and interesting ways. I continued by highlighting that the opportunity to interact with clients at the early stage would aid a universal design approach across all technology.

Indeed it is this early stage approach that has brought me great success with Kickstarter. I often find projects in their very early stages and communicate with the team on how minor adjustments could be made to improve accessibility, be it the addition of audible tones or changing a UI to take into account a blind user. I have also had great success with FitBit and Drop scales. With both companies I advised on how to communicate information in different forms to increase accessibility. The added benefit of this change in communicating information was a greater understanding by all users, not just those who cannot see.

I imagine a world where, as the next 1 billion people and 10 billion devices come online, there is no barrier to interaction, as these products and services have taken a universal approach from the beginning. It is also worth highlighting that this approach can create benefit for all users, not just those who rely on accessibility. For example, a low vision user may be aided by contrasting or night mode colour themes; these exact features also assist any individual using the device at night. The route to a truly intuitive and simple design can also be achieved by taking into account the needs of a blind user, for if you can make a user interface or product that a blind user can utilise, it truly is simple and intuitive.

It was during this conversation that I highlighted how important this approach is to all services and products. There should never be an assumption that a particular product or service will not be utilised by a particular demographic. To highlight this I mentioned how I had utilised RunKeeper to learn to run solo outdoors. It would have been easy for RunKeeper to assume a blind person would not use their app; after all, what use would it be to a blind person? But thankfully they made no such assumption, and I was able to achieve what was once perceived impossible: to learn to run solo outdoors.

I continued by saying this is why I wanted to work with IBM; I wanted to make sure every service and every product across all sectors became accessible. Just imagine the impact this could achieve with the number of partners and clients IBM work with. With accessibility an assumed standard across the board, just imagine the impossible things that could be achieved in the next few years.

During the rest of the conversation IBM HR mentioned they could imagine me starring in an IBM commercial, demonstrating what accessible technology can enable people to do. Well, if we fast forward 2 years, that opportunity arrived. IBM gave me a call and asked if I would like to be featured in a little video. I of course said yes, and the result is the video below.

In those past 2 years I have continued to try and make the world a more accessible place, through advocating for universal design, working with many tech firms and countless public speaking appearances at large tech events. But I still feel I could do so much more, there is still a need for that evangelist role and I am still a great fit. There is a real need to ensure universal design across the board. When that goal is achieved countless people will be enabled to achieve the impossible.

Looking forward to Apple TV OS

My favorite piece of technology in the living room is my Apple TV, and it is about to see a significant update. I love the Apple TV for two reasons: VoiceOver and Netflix. VoiceOver is fantastic at assisting in navigating the UI, as it reads aloud all the elements, and Netflix has fantastic audio described content.

However, it is limiting. I currently only access my media through Netflix, but I have a world of other media. I have numerous DVD and Blu-ray discs, all with great audio described content. The problem is how I access this media. For example, identifying the discs or navigating the menus are both challenging and require sighted assistance. There just isn’t a great accessible removable media device.

So the current solution is to rip these discs along with their audio described content and AirPlay them to my Apple TV. This allows me to use a screen reader to select the content I would like to listen to. But it shouldn’t be this hard, and I hope the new Apple TV can help in this respect.

For the first time the Apple TV has an SDK, meaning developers can create apps for the system. This brings with it the opportunity to access my other media through an accessible UI. This isn’t just hypothetical either, as Plex have already announced their intention to release on the platform.

There is however one caveat: opening up apps on the Apple TV to an SDK, instead of requiring apps to be created under the strict UI guidelines of the past, gives developers free rein. With free rein comes the possibility of apps no longer supporting VoiceOver, or, if they do, no guarantee that all elements of the UI will be labelled. However, this would then merely be a software fix, and I am confident developers would be willing to ensure their apps are as accessible as possible.

There is also another exciting feature of the new Apple TV – Universal Voice Search. This would reduce my need to interact with the UI significantly; now, if I would like to watch the latest episode of a show or a movie, I can just issue the command to Siri. It was also recently announced that this feature would roll out as an API, meaning apps such as Plex would have access.

This really does excite me: instead of asking for help to find the DVD I would like to watch, then needing sighted assistance to select the correct audio track and start the film, I can do all of this myself. A simple voice command will give me independence in viewing media.

The new Apple TV will retain its much loved spot, as it remains the most accessible media viewing device for the living room.

Learn to code in a day?

Learning to code in a day: the premise seems a little far-fetched, so I was certainly intrigued by the event at Decoded in London.

With the breadth of possibilities within coding so large, the focus of the day was on the specifics of creating an app that incorporated location data. Even this reduced focus seems like a mammoth task, especially considering the course is not aimed at people with previous coding experience. In fact it is billed as helping newcomers obtain these skills in a day.

So with zero prior experience, is it possible to enable someone to create a location based app within a day? The quick answer is yes. Everyone on the course successfully created a location based app.

The day is broken into a few distinct learning methods: lectures, hands-on work and team tasks. These three different methods enable participants to gain a rounded knowledge of coding. The introductory lecture is a whirlwind tour of the beginnings of coding; I was disappointed that this didn’t feature Alan Turing, but it was a whirlwind tour after all! This lecture also introduced the technologies we would be utilising in order to create our app: HTML5, CSS and Javascript.

We quickly moved over to the computers and began to create our app. This takes place within an online environment created by Decoded called Playto. The real power of Playto is in its instant feedback: the environment is broken down into columns, and as you type into the editor column, a live view column updates alongside. This means you are given instant feedback on what you are creating, which is an incredibly powerful learning tool, as you can instantly see your level of progress. It is also worth noting that anyone can utilise Playto, not just participants of the course.

As the day progressed we were introduced to HTML and CSS and began to build the look of our website, with functionality reserved for after lunch. The functionality of the website, being its location information, is accomplished through Javascript and some backend tools that are beyond the scope of the single day course. This element was, however, covered in another lecture, but it wasn’t something we created ourselves.

After lunch it was time to make our apps location aware. The premise was to make an app that would allow you to check in within a certain radius of a location; if you were outside the specified radius you would not be allowed to log in. This simple premise has a whole host of possibilities, and this was highlighted to me a few days later. A friend wanted to create an app that would have runners start at a set point, and every hour each runner would have to be beyond a particular radius. As the time increased so did the radius. I realized that the app I created on Coding in a Day could easily be adapted to serve this function.

To complete our coding task we were broken into two teams, with each team assigned a coding research task needed to complete the project. This was an interesting learning experience, as participants had the opportunity to communicate with team members in a way which previously may have been difficult and daunting. This is a fantastic skill that will transfer to the workplace and allow individuals to communicate with engineers and developers.

With the team tasks complete and everyone’s app functional, Coding in a Day was complete. I realized just how empowering the day had been: in a single day everyone on the course now had the skills and confidence to create something themselves and, importantly, the ability to communicate with the relevant teams in their workplace. Coding skills are rapidly being highlighted as essential, and perhaps so should courses like Coding in a Day. It has the ability to enable all team members to understand the process and language needed to communicate with development teams, which will truly become an essential skill as the workplace evolves.

For me personally the course reinvigorated my interest in coding; I returned home and spent the next few days researching location services within iOS and playing with PHP. I look forward to where I will be in a few months’ time and how much my own coding will have improved. It also reminded me of how much I enjoyed my previous career in the educational sector; it was facilitating others to learn that was the truly gratifying part of my job.

The questions…

My favorite part of public speaking is the Q&A section at the end. It’s interesting to be challenged by people all the time; I especially like questions that start with “I know it’s personal but…”. These questions are usually challenging to answer and I do enjoy that. While it may sound scary to be stood on stage while possibly thousands of people stare and wait for an answer, it always leads to interesting trains of thought.

Recently at an event for PwC I was asked the question “What is your biggest dream?”. Now, looking back and with time to think about it, while I answered the question honestly I didn’t feel I gave the justification as to why. It is after all the why that makes it interesting.

The question was – “What is your dream?”. I responded with “to be VP of accessibility at a major tech company”, then went on to discuss my dreams within the realm of adventuring.

This doesn’t answer why I want to be VP at a major tech company. Well, that is because of a dream.

Access to information is essential for the advancement of anyone, from learning to simple day to day news gathering. For someone with sight loss that is immensely difficult. I can’t just pick up a book or magazine, or even go to a website, and read the latest information. Essentially the majority of traditional forms of information are beyond what I am capable of accessing. This can make education incredibly difficult and place the visually impaired at a severe disadvantage.

While studying for my degree this lack of access to information became incredibly apparent. While a facility did exist to make information accessible – in an audio format – there was a substantial time delay, to the point where, if I stood any chance of finishing my degree, I would have to complete essays in 2-3 weeks. My dissertation was completed in 8; that was not by choice, that was a constraint introduced by access to information. I will quickly add that I was 4 points off the highest possible mark however!

But it is precisely this access to information I want to change. Mobile is truly a revolution in access to information, and that is where great change can take place. Android has the largest market share in terms of devices and could make an incredible global difference through accessibility. As the next billion people come online, imagine enabling a visually impaired person for the first time to access a book, the day’s news, or even a menu at a local restaurant. This is all possible by utilising screen reading technology and OCR through smartphone cameras.

What is needed is rapid improvement in accessibility features, vast improvements in universal design and a focussed concentration on inclusive, user-centric design.

And it is precisely for all these reasons that I desire a senior role at a tech company: to help instigate that change and enable learning for all.

There is a place for this within other organizations too, and it was a topic of conversation with IBM. Any company that has a large consulting role has a wonderful opportunity: an opportunity to touch hundreds if not thousands of companies throughout the world, to initiate these changes and work towards a more inclusive focus for technology and services.

So that’s the why, I just didn’t condense it on stage on the day!

Accessibility of Facebook on the desktop

Maximising the accessibility of a website is always of great importance. As well as developers improving accessibility, there is also a plethora of tools available that increase accessibility client side. I always encourage the use of client side tools, as they make for a richer, more seamless experience.

In the quick look video below I demonstrate the ability of Hacker Vision for Google Chrome. Hacker Vision is an extension which is capable of intelligently inverting websites, by which I mean it will only invert text and background elements rather than images. This is a vast improvement over, say, the invert feature built into Apple hardware: as the built in invert operates at the hardware level, the entire screen is inverted, resulting in elements such as images becoming inverted. Hacker Vision negates that and makes for a far more pleasant web experience for the low vision user.

The video also demonstrates the different forms of zoom available to the user, and quickly compares window zoom versus font scaling. I believe font scaling to be incredibly powerful on the client side and is something I will touch on in a subsequent ThoughtCast.

I chose to demonstrate these features with Facebook, mainly because Facebook is often cited as having poor accessibility. I do not believe this to be true; a fairer assessment would be to say Facebook is doing a reasonable job and it plays relatively well with client side tools. However, it must be noted that these client side solutions will work on any website, and in the case of Hacker Vision can even be tailored to invert only the websites you wish it to. Therefore, a website that already has a dark theme would not be inverted.

Apple and IBM great partners for an accessible enterprise

Accessibility in the workplace is often viewed as a difficult task. Thanks to the growing IBM and Apple partnership, this may become a thing of the past.

Through the mobile first initiative, IBM are now beginning to offer the Mac desktop as an enterprise solution. This removes one of the biggest barriers to accessibility in the workplace. With Windows being the dominant platform within the enterprise, there was a need to install additional software to increase accessibility, which could be seen as adding additional complexity and cost to the system. While there are solutions in place to finance these accessibility needs, the additional cost will not be required with the enterprise-level switch.

Accessibility will be baked into every single Mac desktop. No need for specific accessible software or hardware. The same desktop that other employees utilise can now be used by those with a need for accessibility tools. This removes any additional cost and complexity in supporting users who require the use of accessibility tools.

I do hope this becomes the industry norm. The Mac desktop is capable of transforming the work environment and finally giving a level playing field to all.

This accessibility also extends to the Apple mobile platforms. Therefore, the office of the future that includes desktops, iPads and iPhones will have accessibility built in throughout. This could truly be a watershed moment for accessibility in the enterprise.

Currently the Mac desktop offers the best accessible business solution; the combination of OS X and LibreOffice as an office suite is highly accessible. Not to mention that LibreOffice is free and open source.

I am truly excited for the future of the enterprise and thankful of such a great partnership between IBM and Apple.

Raspberry Pi, Dropbox and syncing external folders

In order to silence my ever so slightly aging iMac I decided to swap the mechanical HD for an SSD. This also had the added benefit of increased speed, with one downside: a large reduction in storage capacity. I had gone from 1TB down to 240GB.

Around the same time I had upgraded my Dropbox to the Pro account, with 1TB of storage. I thought my storage problem would be solved. Oh wait, hang on, I still need the files locally to sync with Dropbox.

This problem set me down the path of configuring my Dropbox to sync files outside of the Dropbox folder. But instead of the folder being on an external drive (too noisy!), I would place the external folder on a Raspberry Pi with an external drive, placed in another room. No noise!

The following is a guide on how you can configure a Raspberry Pi with a Samba share, which in turn can be symbolically linked on OS X to a folder within Dropbox, therefore allowing you to have a large synchronised version of your Dropbox files locally.

What you need

Raspberry Pi – Assuming Raspbian installed and running
External HD – For the Raspberry Pi
Dropbox account

Configure the external HD

For this particular setup I was concerned about speed. Therefore, I chose the ext4 format for the external drive. You can of course choose any other format such as HFS, NTFS or FAT for your drive, but this guide specifically deals with ext4.

Identify drive to format

To ensure you format the correct drive use the following command at the terminal on your Pi.

blkid

Your external drive (assuming it is the only drive plugged in) will likely be listed as /dev/sda1, followed by its UUID. Take note of the UUID, as you will need it later.
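For reference, the line for the external drive will look something along these lines – the UUID and filesystem type here are just placeholders, yours will differ:

/dev/sda1: UUID="1234-ABCD" TYPE="ntfs"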

Format the drive

The following command will format the drive on sda1 (change this to your drive if it differs) with ext4.

sudo mkfs.ext4 /dev/sda1

Create a folder to mount the drive to

I chose the name “cloud” for my folder, but feel free to change this.

sudo mkdir /media/cloud

Auto mount the drive

In order for the USB drive to mount on reboot, we have to amend the fstab file with our drive’s UUID, mount point and permissions. You can obtain your drive’s UUID with the blkid command from above.

sudo nano /etc/fstab

We need to add the following line, replacing XXXX with the UUID of your own drive:

UUID=XXXX /media/cloud ext4 defaults 0 0

Test the auto mount

To test the auto mounting without a reboot, type:

sudo mount -a

Hopefully there should be no errors, and you can check the drive has mounted with the df command. If you did receive an error, go back to the step above and make sure there is not a typo within the fstab file.
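For example, the following should list the drive mounted at /media/cloud, along with its size and available space:

df -h /media/cloud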

Permissions

To ensure the user pi has full permissions, issue the following commands:

sudo chown pi:pi /media/cloud
sudo chmod 777 /media/cloud

Configure Samba

First we must install Samba

sudo apt-get install samba samba-common-bin

Next we have to configure the smb.conf to share our USB external drive with users on the network.

sudo nano /etc/samba/smb.conf

At the bottom of the smb.conf file add the following text. This will share the /media/cloud directory, which we mounted our USB drive to, as “Cloud”.

[Cloud]

comment = Cloud
path = /media/cloud
valid users = @users
force group = users
create mask = 0660
directory mask = 0771
read only = no

Samba permissions

This configuration assumes you will log in to the Samba share with credentials. In order to add the user pi to Samba, use the following.

sudo smbpasswd -a pi
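It is also worth making sure Samba picks up the new share. You can sanity check the configuration with testparm and then restart the service – on recent, systemd based Raspbian releases the command below should work, while on older releases sudo service samba restart does the same job.

testparm
sudo systemctl restart smbd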

Raspbian configuration complete

The configuration on the Pi is now complete. Assuming you followed all the naming conventions above, you will now have a share named “Cloud” on the server “raspberrypi” (the default server name on Raspbian), which you are able to connect to with the user “pi” and the password you chose in the final step above. Now all we need to do is configure Dropbox on OS X.

Configure Dropbox

This is the simple part! In the OS X Finder there should be a server in the sidebar named “raspberrypi”. Click on this and then click the “Connect As” button. Use the credentials from the final Samba configuration step, so username “pi” and the password you chose.
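If you prefer the terminal, the share can also be mounted from the command line. This is just a sketch and assumes the share is named “Cloud” as configured above and that the Pi is reachable as raspberrypi.local on your network; you will be prompted for the password you set with smbpasswd.

mkdir -p ~/cloud-mount
mount_smbfs //pi@raspberrypi.local/Cloud ~/cloud-mount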

Link an external folder to your Dropbox

Open the terminal on OS X and use the following command

ln -s /path/to/desired-folder ~/Dropbox/desired-folder

The easiest way to achieve this is to type “ln -s” at the command line then drag and drop the “cloud” folder from the mounted raspberrypi server to the terminal. Then do the same with the desired folder from your Dropbox account.
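As a concrete example, assuming the share mounts at /Volumes/Cloud (the usual location when connecting through the Finder) and you want it to appear as a folder named “Cloud” inside your Dropbox, the command would look like this:

ln -s /Volumes/Cloud ~/Dropbox/Cloud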

And that’s it! Now any file you place in that folder in your Dropbox will be stored on the external share on the Raspberry Pi, therefore not taking up space on your small SSD!

Benchmarks

I benchmarked this setup with a very quick and perhaps dirty test. Using nload, I copied over 40GB of media data with varying file sizes, from 4MB to 4GB. I received an average throughput of 30Mb/s and a peak of 42Mb/s. This is absolutely quick enough for this purpose, as the files will not sync to Dropbox at anywhere near that speed.

Hang on…

Couldn’t you just link a folder in Dropbox to an external drive on your Mac?

Yes, but then you would have the noise of a spinning drive!

But these files are stored on a server, not locally

I used the term local to mean within my local vicinity.

Will this work with Box, OneDrive, etc.?

Yes, this will work with anything.

Did you just make this complicated so you could use a Raspberry Pi?

Yes, but I love using Raspberry Pis for specific uses. They are so cheap and so much fun to play with.

An accessible oven?

Continuing my foray into the kitchen, I am amassing an even larger collection of specific kitchen gadgets. With the new diet commencing, I had a need for omelettes. In an attempt to be a little healthier I use more whites than yolks, and to aid in splitting the whites from the yolks I purchased an egg separator. It works surprisingly well and acts as a reminder that there is often a solution to a problem; you just have to look for it.

 

It is often these gadgets, created for very specific use cases, that enable me to function in the kitchen. While never envisaged to be used by the blind, their highly specialised function often makes them suitable for me.

 

I have found there are numerous gadgets that aid in the preparation of food but not in the actual cooking. I feel this is because the oven, hob and microwave don’t receive much focus in terms of specific use cases, and therefore do not see large functional improvements.

 

Well, at least that was what I thought until I heard about the June oven. Through an integrated high definition camera the June oven is able to identify what you are attempting to make and can suggest estimated cooking times. Right now it can identify chicken, steak, white fish, salmon, bacon, cookie dough, brownies, bagels, toast and burger buns. For a full breakdown of the oven’s capabilities it’s worth checking out The Verge’s article on it.

 

The oven is also equipped with WiFi and a touch screen, and is able to live stream the cooking process. Along with its ability to estimate cooking time, it was the WiFi and touch screen that really stood out to me. With this system having WiFi, it doesn’t seem a stretch of the imagination to be able to control the oven through a mobile app.

 

Imagine an oven I can control through an iPhone app: being able to set the temperature, have the oven identify that I want to cook a steak and have it suggest a cooking time!

 

This would literally be a game changer in the kitchen for me and would open up incredible possibilities in what I am able to cook easily and independently. Pairing it with other technology in the kitchen, I can see myself being able to create high quality healthy dishes for the first time in my life.

 

So June, I am here, I will be your beta tester. Let’s make an oven that can transform people’s lives.

Review – FitBit Charge HR

I am a great fan of anything related to fitness tracking. I am constantly testing different wearables to identify one that not only tracks useful information, but is accessible. I was excited to hear about the FitBit Charge HR, as I have become interested in tracking my heart rate. The following review is thanks to FitBit allowing me to test out the Charge HR, in order to highlight how useful it can be to someone with a visual impairment. The FitBit Charge HR is a watch type wearable that is able to track steps taken, heart rate, floors ascended, distance moved and calories burnt.


Setup

For the visually impaired market there are not many consumer goods that can be purchased and configured without sighted assistance. The one exception used to be Apple; well, I say one, there are now two exceptions, as the other is FitBit. I was pleasantly surprised by the configuration of my Aria WiFi Scales, as this could all be achieved from my iPhone. I was further surprised when the same could be said for the Charge HR. As the configuration takes place within the FitBit app, the entire process can be assisted with VoiceOver. There is, however, one little caveat – Bluetooth pairing. This requires you to input a 4 digit number displayed on the screen of the Charge HR; the screen is high contrast, so a low vision user would be able to pair. If you are unable to rely on sighted assistance, this step can be overcome by using a service such as TapTapSee or BeMyEyes, as the screen is of high enough contrast for the number to be easily seen through either of these services.

The ability to configure a device out of the box yourself is refreshing. Especially when the product has not been specifically designed for the visually impaired market. It highlights how achievable these things are.

Application

I first began to use the FitBit application with my Aria WiFi scales, and it was that experience which made me seek out the Charge HR. The application is very accessible; the main functions are all labelled appropriately. Therefore, a blind or low vision user could easily use this application to track steps, record exercises and track sleeping habits. However, it is once you want to dig a little deeper that a few accessibility issues arise. This is more to do with how information is presented: complex data is often represented by bar charts, pie charts and graphs. I can understand why this is the case; however, if there was an option to view the information in a tabulated form it would be far more accessible. I must make it clear that you can access the information, it just takes a little thinking to understand what it means. The video at the bottom of this article demonstrates this.

Heart Rate and Exercise Tracking

For me the optical heart rate tracking capabilities (no need for a chest strap) of the FitBit Charge HR were the real big draw. While I have owned heart rate trackers in the past, due to my inability to view the watch screen they have been a post-race element – something I could analyze afterwards but was unable to access in real time. Therefore, when I placed the Charge HR on my wrist, opened the app and got a real time readout of my heart rate, it was the first time I could access such information. It is easy for this to be quickly passed over, but for me this was a powerful moment; something that was impossible to do before was now possible.

This information can also be viewed on the screen of the Charge HR; if you have reasonable acuity and even a few degrees of vision, the screen is viewable.

As the Charge HR records your heart rate throughout the day, it is able to give your current heart rate as well as your resting rate. It is actually the resting rate I am interested in. For some reason I have become a little obsessed with a low resting heart rate, and often find myself with heart rate envy at my friends’ insanely low resting rates. So this is a nice little touch on the heart rate side.

Your heart rate is also logged during exercise, and post exercise analysis highlights the different heart rate zones as well as time spent in each zone. This activity tracking is where the accessibility of the application becomes difficult. There are some data points in this section which are difficult to convey without the use of pie charts. Therefore, VoiceOver struggles, as none of these items are labelled sufficiently to give an in-depth sense of what is going on. I also included an insight into this in the video at the bottom of the article.

Steps


Step accuracy is always up for debate, so just how accurate is the Charge HR? Well, I wear the Charge HR on my dominant wrist, so I thought this may affect the count substantially. Fortunately, there is a setting within the app to choose which wrist you wear the Charge HR on. I decided the only true way to test was to count out some steps, so I strode out 166 steps around the house and the Charge HR got it exactly right. I do concede that 166 is a low number, but I didn’t fancy counting out 2,000 steps to check the accuracy. Now that isn’t exactly the truth, I did try, while on the treadmill. But then I got distracted and lost count!

One problem I hadn’t considered when wearing the watch on my dominant wrist was using a long cane for mobility. I usually use my guide dog, but now and again it isn’t appropriate or is too difficult to take my dog with me due to travel plans. In these situations I use the long cane, with a sweep or tip tap motion depending on the surface. I thought this may stump the Charge HR, as it would remove the arm swing, which I had assumed was used as a variable for tracking steps. However, with a little count off, the Charge HR also showed itself to be reliable when using the long cane. Incidentally, it is fine with the guide dog, as I hold the harness in my non-dominant hand.


Silent Alarms

The FitBit Charge HR supports silent alarms. This is a surprisingly great feature for the visually impaired. The alarms are set through the app and are therefore very accessible, meaning it is incredibly easy to set multiple alarms that repeat on specific days. For example, my main alarm is for Monday, Wednesday and Friday, as these are the days I wake earlier to work out. It really is an underrated feature, to have the ability to create a series of complex alarms that are delivered as a series of vibrations on your wrist.

Challenges

The challenges on the FitBit Charge HR are surprisingly motivational. I would find myself setting a challenge with a friend and pacing around to increase my step count. Perhaps this highlights my sheer level of competitiveness, or that a fitness tracker like this does indeed encourage you to move more. However, the challenges system is very difficult to use for the visually impaired, due to how the information is presented. Your relative position in the challenge, be it 1st, 3rd or 5th, is presented in chart form. This is completely inaccessible to VoiceOver and nothing is read out; the only piece of information accessible through VoiceOver is how many steps you are from the lead, or indeed how far you are in the lead. The video at the bottom of this review briefly touches on this.

Sleep mode

The sleep tracking system of the FitBit Charge HR appears to work very well. It accurately calculates when I fall asleep and when I wake, along with moments of restlessness through the night. Unfortunately, however, it suffers the same problem as the challenges and activity tracking: it struggles to convey the information through VoiceOver, and you can often find yourself wondering how relevant the information is. But if you are after the simple elements, such as duration spent sleeping, number of restless moments and number of times awake, it appears to work flawlessly.

Battery

As a blind user the battery was something that concerned me. Often the only indication a wearable’s battery is about to die is visual, so you are often left puzzled as to why your watch no longer tells the time, or why those vital steps are no longer being logged. So I was pleasantly surprised that there are not one but three accessible ways the Charge HR notifies me of a low battery. The first way I found particularly cute: I was sent an email informing me of a low battery. You can also access this information via a notification sent by the application, as well as by checking the current level within the app. This is particularly helpful when I am travelling for a few days, as I can be sure that the battery level is high.

The battery itself manages around 4-5 days of usage, but I believe this is perhaps reliant on how often the screen is used and indeed your daily step count. For example, on some days I may log close to 60,000 steps, so I would not expect it to last as long at that rate.

Feel

Upon first wearing the Charge HR it can feel a little strange. I believe this may be a result of the protruding heart rate sensor. However, after a short while accommodation kicks in and you are no longer aware you are wearing the Charge HR.

It is also worth a quick mention about skin irritation. When I first began to wear the Charge HR I did find myself itching a little. Now, I am firmly putting this down to the media attention around the issue and the fact that it is mentioned during configuration of the Charge HR. It is almost like the effect is being primed, so when you do get a little itch it is quickly attributed to this issue.

However, I can safely say I no longer notice any form of irritation. I even wear the Charge HR in the shower and don’t suffer any irritation.

Overall

I would highly recommend the FitBit Charge HR to a visually impaired user. FitBit achieves something that only a few products are able to: be accessible out of the box. That is especially impressive when you realise it is a mass consumer device and not something specifically made for the visually impaired community. The issues that the FitBit Charge HR does suffer from are all software issues, and more precisely issues conveying complex information. This is something that is not limited to FitBit and is more a comment on the industry’s focus on infographics; I would like to see a nice balance between prose and infographics. This could easily be achieved by in-depth labelling of elements, as VoiceOver could then give the vital context provided by the graphical elements. However, because this is software, these issues could certainly be addressed in the future, and even if not, the device is incredibly accessible as it is. Therefore, if you are looking for a fitness tracking device I highly recommend the FitBit Charge HR.