Live subtitles: How smart technology could help deaf people

Image caption: A close-up of William wearing Google Glass. The glass screen sits directly across his eye and he is pressing a button on the side of the device

There are many new technologies that can help people with disabilities, such as 24/7 live subtitling for deaf people, but how well do they work?

Deaf people always remember the first time a new technology came on the scene and made life just that little bit easier in a hearing world.

I've had many firsts. Television subtitles, text phones, the advent of the internet and texting all opened up opportunities for me to connect with the wider world and communicate more easily.

So when I first heard about Google Glass - wearable technology that positions a small computer screen above your right eye - I was excited. Live subtitling 24/7, and calling up an in-vision interpreter at the touch of a button: remarkably, both seemed possible.

That was a year ago. Since then, Tina Lannin of 121 Captions and Tim Scannell of Microlink have been working to make Google Glass for deaf people a reality. They agreed to let me test out their headset for the day.

First impressions are that it feels quite light, but it is difficult to position so that the glass lens is directly in front of your eye.

Once you get it in the "sweet spot" you can see a small transparent screen; it feels as though it is positioned somewhere in the distance, and it is in sharp focus. The moment you get the screen into that position feels like another first - another moment when the possibilities feel real.

But switching your focus from the screen to what's going on around you can be a bit of a strain on the eyes. Looking "up" at the screen also makes me look like I'm a bad actor trying to show that I've had an idea, or that I'm deep in thought.

The menu system is accessed in two ways. There is a touchpad on the side which you can swipe back and forth or up and down, tapping to select the option you want.

Image caption: Google Glass can be used for live subtitling

Or you can control it by speaking, but this can be difficult if you're deaf. Saying "OK Glass" to activate voice commands can be a bit hit and miss if, like mine, your voice is not clear.

One of the main problems is the "wink to take a picture" setting. But I wink a lot. I also blink a lot. So I turn that setting off.

After a few minutes of reading online articles via Glass, it's time to test out live remote captioning software in the real world. Lannin and Scannell's service, MiCap, is a remote captioning system that works on several platforms - laptop, tablet, smartphone, e-reader and Google Glass.

We set up in a quiet meeting room. After some fiddling with wi-fi and pairing various devices, we put a tablet in the middle of the table as our "listener", and put the headset on. As three of my colleagues engage in a heated discussion about the schedule for programme 32 of See Hear, the remote captioner, listening somewhere in the cloud, begins to transcribe what they are hearing.

My first reaction is amazement. The captions scrolling across the screen in front of my eye are fast and word-perfect, with a time delay of just one or two seconds. It is better than the live subtitling seen on television, and better than most palantypists who convert speech to text. I can follow everything that is being said in the room. Even more impressively, this is the first time the app has been tested in a meeting. I can look around, listen a bit, and read the subtitles if I miss something.

But after a while, tiredness overtakes excitement, and I take the headset off.

The headset itself is uncomfortable and fiddly, but despite this my first experience of Google Glass was enjoyable. It doesn't offer anything that I can't already do on my smartphone, but the ability to look directly at someone while reading the subtitles does make social interaction more "natural".

I am excited about the apps and software being developed by deaf-led companies in the UK. Not just remote captioning - also remote sign language interpreting. UK company SignVideo are already the first to offer live sign language interpreting via the Android and iOS platforms, and say that they'll attempt a Google Glass equivalent in the future if demand is high enough.

Other companies, such as Samsung and Microsoft, are developing their own smart glasses and wearable technology, and as these innovations reach the mainstream, the range of applications that could help disabled people seems likely to grow.

There are lots of exciting tech firsts to come, but I still prefer a more old-fashioned technology - the sign language interpreter. They're temperamental, and they might make mistakes too, but they're fast, adaptable, portable - and they don't need tech support when things go wrong.

Follow @BBCOuch on Twitter and on Facebook, and listen to our monthly talk show.