Blog

How smart subtitling technology could help deaf people

Author: William Mager | Original article: BBC News | Reproduced with kind permission of the author

Subtitling & new technologies

There are many new technologies that can help people with disabilities, like live subtitling 24/7 for deaf people, but how well do they work?

Deaf people always remember the first time a new technology came on the scene and made life just that little bit easier in a hearing world.

I’ve had many firsts. Television subtitles, text phones, the advent of the internet and texting all opened up opportunities for me to connect with the wider world and communicate more easily.

So when I first heard about Google Glass – wearable technology that positions a small computer screen above your right eye – I was excited. Live subtitling 24/7, and calling up an in-vision interpreter at the touch of a button: remarkably, both seemed possible.

That was a year ago. Since then, Tina Lannin of 121 Captions and Tim Scannell of Microlink have been working to make Google Glass for deaf people a reality. They agreed to let me test out their headset for the day.

First impressions

First impressions are that it feels quite light, but it is difficult to position so that the glass lens is directly in front of your eye.

Once you get it in the “sweet spot” you can see a small transparent screen; it feels as though it is positioned somewhere in the distance, and it is in sharp focus. The moment you get the screen into that position feels like another first – another moment when the possibilities feel real.

But switching your focus from the screen to what’s going on around you can be a bit of a strain on the eyes. Looking “up” at the screen also makes me look like I’m a bad actor trying to show that I’ve had an idea, or that I’m deep in thought.

The menu system is accessed in two ways. There is a touchpad on the side which can be swiped back and forth, or up and down, and you tap to select the option you want.

Google Glass can be used for live subtitling

Or you can control it by speaking, but this can be difficult if you’re deaf. Saying “OK Glass” to activate voice commands can be a bit hit and miss if, like mine, your voice is not clear.

One of the main problems is the “wink to take a picture” setting. But I wink a lot. I also blink a lot. So I turn that setting off.

After a few minutes of reading online articles via Glass, it’s time to test out live remote captioning software in the real world. Lannin and Scannell’s service, MiCap, is a remote captioning system that works on several platforms – laptop, tablet, smartphone, e-reader and Google Glass.

We set up in a quiet meeting room. After some fiddling with wi-fi and pairing various devices, we put a tablet in the middle of the table as our “listener”, and put the headset on. As three of my colleagues engage in a heated discussion about the schedule for programme 32 of See Hear, the remote captioner, listening somewhere in the cloud, begins to transcribe what they are hearing.
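
MiCap’s internals are not public, but as a rough sketch of how this kind of pipeline hangs together, the snippet below models only the display end: a hypothetical client subscribes to a caption stream over a WebSocket and shows each line as the remote captioner sends it. The endpoint URL, message format and field names are illustrative assumptions, not MiCap’s actual API.

# A minimal, illustrative sketch (not MiCap's real implementation):
# a "listener" device streams room audio to a remote captioner, and the
# wearable display receives the transcribed text back. Only the receiving
# side is modelled here, using the third-party "websockets" package
# (pip install websockets).
import asyncio
import json

import websockets


async def receive_captions(service_url: str) -> None:
    """Connect to a (hypothetical) captioning session and show each
    caption line as the remote captioner produces it."""
    async with websockets.connect(service_url) as ws:
        async for message in ws:
            caption = json.loads(message)
            # On Glass this text would be drawn on the head-up display;
            # here we simply print it with its time offset in seconds.
            print(f"[{caption['offset_s']:6.1f}s] {caption['text']}")


if __name__ == "__main__":
    # Placeholder endpoint: a real session would need authentication and
    # a booking arranged with the captioning provider.
    asyncio.run(receive_captions("wss://example.com/captions/session-123"))

The one-to-two-second delay described below comes from the human captioner’s typing speed plus network latency, so the display side only needs to show text as fast as it arrives.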

My first reaction is amazement. The captions scrolling across the screen in front of my eye are fast and word perfect, with a tiny time delay of one or two seconds. It is better than the live subtitling seen on television, and better than most palantypists who convert speech to text. I can follow everything that is being said in the room. Even more impressively, this is the first time the app has been tested in a meeting. I can look around, listen a bit, and read the subtitles if I miss something.

But after a while, tiredness overtakes excitement, and I take the headset off.

See Hear is broadcast on BBC Two at 10:30 GMT on Wednesdays – or catch up on BBC iPlayer. The new series begins on 15 October 2014.

The headset itself is uncomfortable and fiddly, but despite this my first experience of Google Glass was enjoyable. It doesn’t offer anything that I can’t already do on my smartphone, but the ability to look directly at someone while reading the subtitles does make social interaction more “natural”.

Future developments

I am excited about the apps and software being developed by deaf-led companies in the UK. Not just remote captioning – also remote sign language interpreting. UK company SignVideo are already the first to offer live sign language interpreting via the Android and iOS platforms, and say that they’ll attempt a Google Glass equivalent in the future if demand is high enough.

Other companies such as Samsung and Microsoft are developing their own forms of smart glass and wearable technology, and as these innovations reach the mainstream, the range of applications that could help disabled people seems likely to grow.

There are lots of exciting tech firsts to come but I still prefer a more old-fashioned technology – the sign-language interpreter. They’re temperamental, and they might make mistakes too, but they’re fast, adaptable, portable – and they don’t need tech support when things go wrong.

Google Glass: Meet the Glass Guides

Guides are the first people you meet when you get Glass, and they know just about everything there is to know about it. They help set you up and make sure you’re fitted properly. If you have any questions, issues, or just want to chat, they’re always there for you – on email, phone, and across all Google Glass’ social platforms.

They come from incredibly diverse backgrounds and are seriously some of the smartest, funniest people on the team. See what they have to say for themselves…

This video was recorded entirely through Glass. Sadly, it is not captioned, but check out the transcript below (thanks to our captioner Michelle!).

Transcript – Meet the Glass Guides

>>  I’m from San Francisco.

>>  Huntington Beach, California.

>>  Washington, DC.

>>  Columbia, South Carolina.

>>  City of Angels, baby!

>>  Brooklyn, New York.

>>  St. Paul, Minnesota.

>>  Before I came into the Glass team I was doing a lot of different things actually.

>>  I taught marine science.

>>  I worked on Google+.

>>  I acted all over in regional theatres.

>>  Assistant women’s basketball coach.

>>  I worked in university as a lecturer.

>>  I love meeting all of the Explorers because they all have really great stories.

>>  The best thing about helping Explorers is just being the first point of contact.

>>  I’ve been lucky enough to get to travel around and see a lot of cool places, meet a lot of cool people, all while introducing them to a really cool technology.

>>  Watching them open up the box and be incredibly excited and then giving them the tools they see in class, after they leave.

>>  Our team?  Very fun, very funny, intelligent people, and just very cool and flexible.

>>  I would describe the team as weird and wonderful.

>>  Everybody’s really fun.  Everybody’s really quirky, and everyone’s just a blast.

>>  We love each other so much.

>>  Nobody else cares …

>>  Everybody cares.

>>  … except me!

>>  It’s like being part of a big family.

>>  It’s kind of like coming to work with your friends every day.

Transcript credit: Michelle Coffey

Glass captioning (image credit: Phone Arena)

Google Glass: How to use Glass hands-free

Learn how to activate the Glass screen, respond to notifications and use some other basic features, all without using your hands.

One of our stenographers is writing real-time captions (CART, or verbatim speech-to-text) for a deaf surgeon in the USA who wears Glass. He uses an iPad with a microphone, and hangs it on the IV pole. The stenographer has special settings and a link to the Glass. It really is liberating for this client to have real-time captions as he works.

The first surgeries streamed using Google Glass were performed in June 2013.

Dr Grossmann, a member of the Google Glass Explorer program, performed a world-first surgery with Glass in the USA: a PEG (percutaneous endoscopic gastrostomy). The real technological advance was being able to stream the surgery live to an overseas audience.

The second surgery was a chondrocyte implant performed in Madrid, Spain, and broadcast to Stanford University. Dr Pedro Guillén streamed the live operation while consulting simultaneously, enabling Dr Homero Rivas at Stanford University to attend and provide useful feedback to Dr Guillén in real time.

Dr Christopher Kaeding, an orthopaedic surgeon at The Ohio State University Wexner Medical Center, used Google Glass to consult with a colleague via live, point-of-view video from the operating room. He was also able to stream live video of the operation to students at the university:

“To be honest, once we got into the surgery, I often forgot the device was there. It just seemed very intuitive and fit seamlessly.”

This was the first time in the US that Google Glass had been used during an operation, and it was only used at a very basic level. Possible future uses of this technology could include hospital staff using voice commands to call up X-ray or MRI images, patient information or reference materials whilst they are doing their ward rounds. Being able to get the information they need in an instant could have a significant impact on patient care. Google Glass also offers the ability to collaborate with experts from anywhere in the world, in real time, during operations.

Clínica CEMTRO in Madrid is currently conducting a broad study, involving more than a hundred universities from around the world, to explore the potential applications of Google Glass technology in e-health, tele-medicine and tele-education.

Credits: Ohio State University Wexner Medical Center, Clínica CEMTRO

Google Glass: Getting Started

An introduction to the basics of Google Glass. Learn about the touchpad, the timeline and how to share through Glass.

We welcome the initiative “My community, a city for everyone”, which aims to transform Dubai into a disability-friendly city by 2020. His Highness Hamdan Bin Mohammed Bin Rashid Al Maktoum tries out Google Glass. We hope to give you some feedback soon from our team in the Middle East, so keep your eyes (and ears) peeled!

Hamdan Bin Mohammed Bin Rashid Al Maktoum tries Google Glass

We are delighted to be mentioned in the prestigious Journal of Court Reporting this week.

We provide real-time captions to the wearer, giving access to everyone. Watch out for news and user reviews of Google Glass with live captioning / CART by our deaf team and consumers. As deaf consumers, we think Google Glass is ideal for remote live captioning: we can lipread a speaker, or look at a speaker or interpreter, and read the real-time captions of what is being said at the same time.