Creating, editing and presenting slides as a blind teacher

As many teachers will tell you, having some slides makes teaching a lesson far easier. So how do you create those slides when you can’t see?

PowerPoint and Google Slides work OK in presentation mode with a screen reader, assuming no complex images and a standard flow. However, as soon as you switch to creating or editing, that accessibility quickly disappears. Adding titles, lists and text elements is a laborious task. What might take a sighted person a few minutes to create could take a screen reader user 10 or 20 times longer. Imagine that: your main teaching resource taking 10 or 20 times longer to create, every time.

So what’s the solution? How do I generate slides, present them to students and mask the screen reader? Oh, and right now, do all of that virtually.

The answer is markdown, reveal-md and AirServer. If you don’t teach computing, all those terms may look a little alien.

So let’s start with a sample presentation: Quick demo deck.

The markdown text below is what generates that slide deck.

# Quick slides

## How fast 🚀

- Super fast slides
  - with simple access to lists

---

# Easy Headings

## That get smaller

### & smaller

---

It’s that simple

- quick, simple formatted slides
- Text access to images, video, tables & everything you can think of!

As you can see, it’s just plain text. The punctuation does all the formatting of titles, lists, text and images. Anything you can think to put in a PowerPoint or Google Slides deck you can do with plain text and some punctuation.
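For anyone who wants to try this, reveal-md turns that plain text into a presentable deck from the command line. A minimal sketch, assuming Node.js is installed and the markdown above is saved in a file I’m calling slides.md:

```shell
# Install reveal-md once, globally, using npm (which ships with Node.js)
npm install -g reveal-md

# Serve the deck on a local web server and open it in the browser;
# this is the address I would load on the iPad
reveal-md slides.md

# Alternatively, export a standalone HTML copy of the deck to a folder
reveal-md slides.md --static _site
```

From there it is just a text editor and a save: re-running reveal-md picks up the changes, so editing a deck is as accessible as editing any other text file.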

But how do I mask the screen reader from the students? After all, when I present I need to know what the slides say, but I don’t want to expose the screen reader to the students. It would be highly confusing and distract from my teaching.

The key here is an app called AirServer and an iPad. After generating my slides with reveal-md I access them on my iPad. I then use AirPlay mirroring from the iPad to AirServer running on a Windows laptop. This separates the audio, so with some headphones I receive the screen reader audio and all the students see are the slides. The beauty of this setup is that it works just as well in a physical classroom as in a virtual one.

In the physical classroom the AirServer screen is displayed on the interactive whiteboard and I control it through my iPad. In the virtual classroom, AirServer is the shared or presented app on the call.

Both setups allow me to deliver the lesson without students being aware I am accessing the slides with a screen reader. The true beauty, however, is that I can create a presentation in minutes. I dare say I am faster than a sighted person using PowerPoint or Google Slides. After all, we both have to type the same text, but while they are clicking around the app to make titles or add lists, all I do is type some punctuation and the formatting is taken care of.

It’s a beautiful way to generate slides and, importantly, incredibly accessible to create, view and edit.

Lightweight night vision goggles

Night blindness is a common issue for people with low vision, especially those with Retinitis Pigmentosa. While your vision may be adequate for mobility in daylight, as the night draws in and contrast begins to drop, night blindness occurs. 
When I had sufficient vision for this to be a problem, I was always tempted by night vision goggles. There have even been research projects exploring this possibility. The good news is they can really help with mobility; the bad news is night vision goggles are expensive, cumbersome and heavy.
Due to these restrictions I never quite took the plunge. But an interesting development once again has me intrigued by night vision. Thanks to a new breakthrough, the advantages of night vision goggles can now be had in a spectacle frame. There is still a need for external power, but it is great to see this moving forwards.
As augmented reality products advance it would be great to see this technology integrated to enable low light navigation.

Night vision goggles

Blind hiring? Use the blind

Technology companies have a diversity problem, as do many others. An immediate point of change is the hiring process. My interest was piqued by a comment from Leslie Miley of Slack, who proposed that a blind assessment process be used during hiring, stripping applications of identifiable data.

This is an interesting proposition, and similar to one I have been proposing for a while: don’t simply do blind assessments, use blind people to do the hiring.

Past the application assessment stage, blind people really come into their own. The inability to see the applicant massively reduces implicit bias. It cannot be overstated how important it is to remove those unconscious biases that we all possess but find difficult to identify. Removing the ability to visually trigger these unconscious biases will help diversify the hiring process.

But couldn’t you just wear a blindfold? Why use someone who is blind?

Apart from this being a terrible gimmick, social interactions can be difficult when you remove the vision of one participant. Blind people, however, have had years to perfect non-visual interactions, to the point where, if I don’t have my guide dog or cane with me, no one in a social interaction ever realises I am blind. I can maintain eye contact, which greatly eases the comfort of the other participant, something a blindfolded interviewer would be unable to do.

Blind people have also spent many years learning how to read people without visual cues: actually listening to someone, rather than adding a level of visual distraction. These advanced listening skills take years to hone, and blind people have been perfecting them their entire lives.

So if you want to diversify your hiring process, start by diversifying your hiring team.

Mixed reality systems

Project Tango seemed like a revelation a couple of years ago: a system that could do 3D mapping of environments in a small package. Now, with the demands of inside-out tracking for gaming, we are starting to see other products hit the market.
I still feel this technology has a long way to go, eventually being shrunk down to a sensor as small as, if not smaller than, today’s front-facing phone cameras. Once we arrive at that point we enter the realm of discreet technology that is capable of augmenting reality in interesting ways.
I really see this being a product that is immensely helpful for the assistive technology arena. It will definitely be shaping the future of such products.

Mixed reality systems

Windows running on ARM

Exciting news from Microsoft that future versions of Windows will run on ARM. Perhaps even more impressive, it will emulate 32-bit x86; there is even a demo of Photoshop running on a Qualcomm 820.
If Microsoft can really improve Narrator, as has been mentioned recently, this could be a great unification across all their devices. Not to mention it may instantly solve their lack of apps on mobile, opening the door for a Surface mobile phone.
Windows 10 to run on ARM

Computer vision for the blind

With computer vision rapidly improving, it was only a matter of time before we began to see head-mounted computer vision systems. Horus has a unique approach in that it doesn’t rely on connectivity for the visual processing, which means it will work even when the data connection is down. It covers some interesting basics of computer vision for the blind, reading and facial recognition for example. It does, however, suffer from what I always consider the ultimate pitfall in these products: it was designed specifically for the blind, meaning the cost is high, as the market is small.
There is definitely space for a head-mounted digital assistant, so with a little shift in the market this product could be aimed at a wider audience, bringing down the cost and making it far more accessible.
However, this is a wonderful step forward and I am looking forward to seeing where products like this go.

Computer vision for the blind

The iPhone, Twitter and Night Mode

Night mode came to Twitter on Android last month, so it was only a matter of time before it landed on the iPhone. I believe Apple could take this one step further though. I would like to see night mode as an OS-level option, with apps having alternative themes that are triggered as you toggle night mode in the OS. This would be far simpler than toggling it on a per-app basis. I would say it’s likely Apple will introduce this in 2017, to pair with the OLED screen, simply because it will improve battery life.