Blog

CERA Awards

Customer Experience Recognition Awards

CERA Awards Info Dev World 2014

The 2014 CERA Winners | Photo courtesy of Content Rules Inc.

Do you offer great customer service?

As more and more organizations of all sizes and across all industries recognize the importance of content in supporting business relationships with customers, content producers face tremendous challenges and opportunities that they are meeting in all sorts of innovative ways. The Customer Experience Recognition Awards (CERAs) recognize outstanding contributions to the customer experience made by content developers.

Developed in conjunction with Information Development World, the CERAs culminate in a recognition ceremony at a special luncheon on October 1, during the iDW conference. First, we need to collect the best content-driven customer experience initiatives. Then we’ll have some of the leading experts in customer experience, technical communications, content marketing, accessibility, localization, product management, and content strategy review the projects to determine which ones are worthy of receiving a CERA.

We invite you to nominate and submit an entry online between 9 am (ET) on May 11, 2015 and 5 pm (ET) on August 29, 2015 in one of the following categories: Accessibility, Customer Support – Technical Communications, Employee Engagement, Information Discovery, Translation/Localization, and User Community/Social Media.

As we embark on the 2015 CERAs, we will share additional information on judges, advice on submissions, and details of the recognition ceremony. Sign up for updates here. We hope you’ll join us.

realtime captions at awards ceremony

Al Martine presents the awards ceremony with realtime captions on a large screen

Realtime captions at CERA

At last year’s event, 121 Captions set the benchmark for accessibility for deaf and hard of hearing customers. 121 Captions sponsored the Information Development World 2014 conference in Silicon Valley, where we had a booth to showcase our captioning services. Tina attended as a judge at the Customer Experience Recognition Awards, and we provided realtime captions (CART) for the event. CART is Communication Access Realtime Translation: captions produced in realtime by a stenographer. Although the event was in Silicon Valley, the CART writer we had booked was in Kansas. She simply phoned into a laptop at the venue, which had been set up next to the presenter’s microphone, and listened to the speeches. As she wrote, the captions streamed over the internet and appeared on the large screen in less than one second.
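For anyone curious about the plumbing behind that sub-second delivery, the pattern is essentially a broadcast relay: the captioner’s software pushes each piece of finished text to a server, which fans it out to every connected screen. The sketch below is purely illustrative – it is not the 121 Captions platform code, and the port, routes, and message format are assumptions – but it shows the idea in TypeScript using the ws library.

```typescript
// caption-relay.ts - an illustrative sketch only, not the actual 121 Captions platform.
// One remote captioner connects on /write; any number of screens, tablets, or phones
// connect as viewers. Every piece of text the captioner sends is pushed to all viewers
// immediately, which is what makes sub-second delivery possible.
import { WebSocketServer, WebSocket } from "ws";
import type { IncomingMessage } from "http";

const server = new WebSocketServer({ port: 8080 });
const viewers = new Set<WebSocket>();

server.on("connection", (socket: WebSocket, request: IncomingMessage) => {
  if (request.url === "/write") {
    // The captioner's steno software streams finished text here.
    socket.on("message", (data) => {
      const text = data.toString();
      for (const viewer of viewers) {
        if (viewer.readyState === WebSocket.OPEN) viewer.send(text);
      }
    });
  } else {
    // Big screens, iPads, and phones all join as viewers.
    viewers.add(socket);
    socket.on("close", () => viewers.delete(socket));
  }
});
```

In a production service the messages would also carry formatting and corrections, and the audio leg is handled separately (the captioner simply listens in by phone or VoIP), but one-writer-to-many-viewers broadcasting is the core of it.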

Val Swisher tweets about realtime captions

Tablets and iPads were set up on the tables, and the realtime captions were streamed to these at the same time as to the big screen. We are proud to have the fastest multi-channel realtime caption streaming platform available today. Near-instant access means real inclusion for deaf people, which equates to great customer service!

realtime captions on iPad

iPad screenshot of realtime captions

Providing the best quality realtime captions meant we could really drive home the point that good customer service is easy to do – in this case, making an event accessible to hard of hearing and deaf delegates. Some deaf delegates in the audience had not disclosed their hearing loss prior to the event, and they were delighted to have full access; they had never seen realtime captions before. The feedback from delegates on Twitter showed that even those who could hear appreciated having realtime captions. Captioning is not only for those with hearing loss; it is useful for hearing people too.

Marcia tweets about realtime captions

Find out more about our captioning service

Contact us at bookings@121captions.com or give us a call on +44 (0) 20 8012 8170

SDH subtitles

How subtitles are made for TV

 

This video explains how subtitles or closed captions are made for TV.

1CapApp live captioning

Our live captioning platform

live captioning

Our live caption streaming software is the most stable, most secure, and fastest in the world, streaming live captions to you in less than three seconds. Our captioning platform is the ideal solution for professionals who wish to have their teleconference calls or events captioned. It is now even better!

Our live captioning platform delivers time-saving solutions for CART providers, captioners, and court reporters who stream live captions for business meetings, classroom lectures, depositions, and stadium captioning, as well as Adobe and WebEx collaboration. Broadcast engineers are excited about the ability to provide captions for live video webcasting while at the same time sending closed captions to TV.

In the last few months alone we have added the following enhancements to an already robust platform:

For the Viewer:

  • Streaming text formatting: coloured highlighting, bold, underline, font colours, and more, all saved with the transcript.
  • Streaming text speed control: the viewer can control how fast the captions arrive on the screen, making it easier to follow what is being said; the words flow smoothly across the page at the slower pace (see the sketch after these lists).
  • The stream box, notes box, and chat box are all resizable, and the stream box can be opened in a new window.
  • Optimised viewing screen for mobile devices.

For the Captioner:

  • Test stream: the writer can test the communication link for free, without accumulating session time.
  • Easily managed recurring sessions.
  • Macros for subscript, superscript, bold, and underline.
  • Macro for new line, which solves problems with feeding scripts using the F12 key.
  • Macro for new speaker colour, which automatically changes the text colour for a new speaker.
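The streaming text speed control mentioned in the viewer list is a pacing feature rather than a filter: nothing is dropped, the queued words are simply released at a rate the viewer chooses. Here is a rough sketch of that idea – illustrative only, not our platform’s actual code, and the class and parameter names are made up:

```typescript
// caption-pacer.ts - illustrative sketch of viewer-side speed control, not the
// actual platform code. Incoming caption text is queued word by word and released
// at a rate the viewer chooses, so nothing is lost; the same words simply flow
// onto the screen more smoothly at a slower pace.
export class CaptionPacer {
  private queue: string[] = [];
  private timer?: ReturnType<typeof setInterval>;

  constructor(
    private render: (word: string) => void, // e.g. append the word to the stream box
    private wordsPerSecond = 3              // viewer-adjustable speed
  ) {}

  // Called whenever a chunk of caption text arrives from the stream.
  push(text: string): void {
    this.queue.push(...text.split(/\s+/).filter(Boolean));
  }

  // Called when the viewer moves the speed slider.
  setSpeed(wordsPerSecond: number): void {
    this.wordsPerSecond = wordsPerSecond;
    if (this.timer) {
      this.stop();
      this.start();
    }
  }

  start(): void {
    this.timer = setInterval(() => {
      const word = this.queue.shift();
      if (word) this.render(word + " ");
    }, 1000 / this.wordsPerSecond);
  }

  stop(): void {
    if (this.timer) clearInterval(this.timer);
    this.timer = undefined;
  }
}
```

A viewer who turns the speed up effectively sees the raw stream; a slower setting smooths the flow of the same words onto the screen without losing any of them.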

To find out more about our live captioning platform

Contact us at bookings@121captions.com

laptop view

Tina Lannin Nominated for the UK’s Largest Diversity Awards

Entrepreneur of Excellence

Press release


Tina Lannin, an entrepreneur from London, has been nominated for the Entrepreneur of Excellence award at the 2015 National Diversity Awards.

The ceremony celebrates some of the excellent and inspiring achievements of positive role models and community organisations from across the UK. The awards aim to recognise nominees in their respective fields of diversity including age, disability, gender, race, faith, religion and sexual orientation.

Tina Lannin owns 121 Captions, a company run by deaf people for deaf people, providing live captioning and subtitling services of the highest quality in 17 languages. Being deaf herself, she is passionate about accessibility for deaf people. She obtained a university education without captioning support, used captioning in the workplace for several years, and has a wealth of knowledge to share about captioning, subtitling, and access for deaf people.

The National Diversity Awards 2015, in association with Microsoft, will be held in Liverpool on September 18th. Britain’s most inspirational and selfless people will come together to honour the rich tapestry of our nation, recognising individuals and groups from grassroots communities who have contributed to creating a more diverse and inclusive society.

The largest diversity awards ceremony of its kind has attracted a growing list of top employers, such as Sky, the Financial Ombudsman Service, and PricewaterhouseCoopers.

The prestigious black-tie event has also gained support from a number of celebrities including Stephen Fry, Misha B and Ade Adepitan.

Theresa McHenry of Microsoft UK said: ‘The National Diversity Awards 2013 were thought-provoking, humbling, inspiring and, not least, entertaining. This is the reason Microsoft are delighted to continue to be involved and have committed to sponsoring the National Diversity Awards 2015’.

Amongst last year’s winners was James Partridge, who spearheaded campaigns for social change and pushed for anti-discrimination protection. Jessica Huie took home the Entrepreneur of Excellence award for race, faith & religion for setting up the UK’s most successful multicultural greeting card and gift company. Birmingham LGBT were also recognised for opening the first LGBT Health & Wellbeing Centre in England and Wales.

The National Diversity Awards received an astonishing number of nominations for last year’s event.

Paul Sesay, Chief Executive of The National Diversity Awards, said, ‘It is an honour to witness the extraordinary journeys of Britain’s unsung diversity heroes, and we will continue to recognise their extraordinary achievements during 2015’.

‘I know another fantastic spectacle of role models will be delivered and recognised this year’.

Nominations are now open and close June 21st 2015 – so don’t miss out on your chance to get involved!

Shortlisted nominees will be announced shortly after this date.
To nominate Tina Lannin, please visit the National Diversity Awards website, or for a nomination form, please email emma@nationaldiversityawards.co.uk.

What is CART?

CART stands for Communication Access Realtime Translation, an American term. In the UK, the equivalent is known as Speech-to-Text (STT). This is the true verbatim live transcription of speech – not voice recognition or respeaking, which are not verbatim.
If you’d like to find out more about CART or Speech-to-Text, contact us at bookings@121captions.com to ask for a demonstration and discuss your captioning needs.

 

Author: Rob Roth, AccessComputing staff

AccessComputing announces a new 7-minute 34-second video, Communication Access Realtime Translation (CART) Services for Deaf and Hard-of-Hearing People, that explores what CART is and where it can be used. The video is ideal for anyone who is deaf or hard of hearing and is considering what types of accommodations would be best for participating in a college-level class or at a conference. The video was developed after it was discovered that few, if any, resources were available on the Internet to explain what CART and other captioning systems were. Four students from the Summer Academy for Advancing Deaf and Hard of Hearing in Computing, including two ASL signers, speak about why they chose captioning within a STEM educational setting.

Copyright © 2014 by University of Washington. Permission is granted to copy these materials for educational, noncommercial purposes provided the source is acknowledged. The AccessComputing project is funded by the National Science Foundation (grants #CNS-0540615, CNS-0837508, and CNS-1042260). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author and do not necessarily reflect the views of the federal government. We support the University of Washington’s online privacy statement and its terms and conditions of use.

Captioning a fundraising dinner


We were honoured to caption a fundraising dinner for Associatia Children’s High Level Group (ACHLG) in Belgravia a few days ago. This charity helps disabled and deaf children in Romania to lead more fulfilling lives. It was a high-profile event at the Romanian embassy, attended by Lords, Ladies, and ministers. The 121 Captions team was very excited to have a night of fun with a Romanian banquet, the musicians Grigore Lese and Maria Raducanu, and a top chef flown in from Romania – and the entire event was live captioned by us.

The most important element was preparation before the event. We had to be sure that the event venue and organisers would facilitate our online caption service. We were sent the names of the presenters, scripts, the agenda, and slides well in advance. This really helps the captioner ensure she has all the words and names in her dictionary; adding short forms for these words before the event makes the captioning faster and more accurate. She was then able, for example, to quickly and accurately write the name of the chef Mircea Dinescu and all the names of the Romanian wine and food that he spoke about before dinner.
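Conceptually, those short forms are job-dictionary entries: a brief steno stroke mapped to a full word or name, looked up as the captioner writes. The toy sketch below illustrates the lookup only – real CAT software such as Case CATalyst is far more sophisticated, and the strokes shown here are invented for the example:

```typescript
// brief-lookup.ts - a toy illustration of job-dictionary briefs; the strokes are
// invented and real CAT software is far more sophisticated. Before an event the
// captioner adds short forms for tricky names so that a single stroke comes out
// as the full, correctly spelled word.
const jobDictionary = new Map<string, string>([
  ["PHEURS", "Mircea Dinescu"],   // invented stroke for the chef's name
  ["TKPWREUG", "Grigore Lese"],   // invented stroke for the musician's name
  ["ROPL", "Romanian"],           // invented stroke
]);

// Translate a sequence of strokes into caption text, falling back to the raw
// stroke when there is no dictionary entry.
function translate(strokes: string[]): string {
  return strokes.map((stroke) => jobDictionary.get(stroke) ?? stroke).join(" ");
}

console.log(translate(["PHEURS", "ROPL"]));
// -> "Mircea Dinescu Romanian"
```

The prep material we receive in advance is what makes it possible to build these entries before the first speech begins.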

The next step in our preparation was the sound check. A member of our team went to the embassy the day before and set up an audio test for the captioner, who was in the USA, with the gracious help of the Romanian embassy’s deputy director. We were all set for the exciting event.

Online captions streamed to a mobile phone during the auction

Online captioning streamed to a mobile phone during the auction

During dinner, we listened to speeches from the Baroness Nicholson of Winterbourne and the Minister of Education, among others. The auction was fast and furious! We were able to follow it all as the captioning only took 1 second to come through to our smartphones. It was fantastic to be involved and be able to join in, as most of the 121 Captions team is deaf. It was a really fun night and such an enjoyable way to help the charity Associatia Children’s High Level Group.


Tina Lannin (Director), Baroness Nicholson of Winterbourne, Suzie Jones (Training Manager)

A morning in the life of a CART writer

CART: Communication Access Realtime Translation. Known as Speech-to-Text in the UK.

First off, hello. My name is Jenn Porto. I’ve been a CART writer for approximately 7 years. My purpose is to share my fly-by-the-seat-of-my-pants situations that I encounter on the job. This post doesn’t make me an expert and does not mean that I’ll always make the right decisions. I may say/do something that makes you wince. I’m okay with that. There is no rule book for being a CART writer, and because we work “alone” on the job, we rarely get to share these moments and get POSITIVE feedback. With that said, I am not always going to be grammatically correct according to Morson’s English Guide. This is an account of my day and my thoughts as they come to mind.

August 13, 2014
Where’s my Advil! I left my house at 8 a.m. for a 10 a.m. job with only 27 miles to drive. You do the math. Welcome to the evil beast we call the “Los Angeles freeways.” It’s a good thing I left so early! Full of anxiety, I exited the freeway at 9:27. Every thought I had was followed by, “I should be setting up by now!”

“There’s my parking structure!!!” Shoot, a line! There’s a gigantic sign that reads: Government vehicles only. “Well, this could be a problem.” Rechecking the agency’s instructions, I confirm that I’m at the correct structure. I see a metered spot on the street open. “Great! Well, that never happens in LA.” Get that spot! Shoot, I can’t make a U-turn because I’m surrounded by government buildings and every police car in the city is lurking about. Quickly pulling out of the line, merging three lanes over to the turn lane, squeezing behind a postal truck, turn, turn, legal U-turn, swoosh into my parking spot. I jump out and rush to the meter and, “Curses! The meter is broken!” This explains why the spot was open. I can’t risk a parking ticket. I get back in my car and head back to the darn structure and the darn line. Third in line.

9:35’ish, “Hi. Jennifer Porto. Here for the City Planning meeting.”
With a heavy accent, the parking attendant responds, “City Plant.”
“Ummmm, yup.” WTH is City Plant? Don’t know. Don’t care. “Thank you.” Off I go down the rabbit hole to the basement of the structure. Park. Pulling out my Stenograph bag, bag with monitor, Tory Burch grown-up high heels, and away I go hustling through the garage to find the elevator. I’ve gotten lost every time I’ve parked in this labyrinth of cars, so I was relieved to see a businessman making the same mad dash I was. “Follow him.”

9:45, 15 minutes to go. Up the elevator. Press the third-floor button with authority as if to urge this toaster on pulleys to move faster. “It’s so hot in this elevator. Where is my rubber band? Screw my bouncy curls. This is what being in a toaster feels like! Focus, Jennifer.” Doors open. I’m out. Turn left, right, through the bridge connecting the buildings. “Crap, I don’t have time for a metal detector.” Slap my bags down on the table and wait patiently for the guard to look up from his phone. Now, one would think that since I’m in the building, things would start to get easy — WRONG!

9:50, 10 minutes to go. I found the correct room and tried to slip unnoticed through the door with my hands full of Stenograph luggage — not happening. Bang, bonk, bang. “Sorry.” I sit in the back and scan the room for my client. He’s not here. “Will I get lucky and he’s a no-show?” Fingers crossed!

10:00, go time. The client is still not here. I had set up my equipment in the back of the room, taking up three empty seats. There was a pole blocking my view of the board members. Don’t care. My setup includes a Mira, a Dell laptop that sits on a laptop stand, and an extra monitor set up next to me. Everything is plugged in. “Boo-ya, I’m ready to start!” No need to duct-tape wires down to the carpet at this point; the power plug was right behind my chair. With a second to breathe, I casually walked to the side of the room to gather the board members’ names. I usually get prep before meetings, but not this time.

10:15, still no client. I’m double stroking the board members’ names to make my speaker IDs. SKWREUPL/SKRWEUPL = JIM:. The door opens slightly and I can see my client. “Hmmm, why is he not coming in?” It turns out that he had requested a sign language interpreter to voice for him, and for the third time an interpreter did not show up. I offered to voice for the client, as I’ve done with this client in the past. He types what he wants to say and I verbalize his text. It’s usually not a problem — usually.

10:20, the planning administrator now tells me with a pleading look in his eye, “I know you’re set up in this room, but we’re moving to room 1060. Okay?”

With a smile and nod, “You got it. Let me gather my stuff.”

With a double step, I gracefully bust through the door. I break down my setup, clink, bonk, stuff in bag, click, clink. Mira and tripod, laptop lid shut, laptop stand and tripod, pen and pad all get placed in my bag (let’s be real, thrown in my bag); the monitor is in a separate bag. I’m out! “Really!? Of course 1060 is on the opposite side of the floor, of course it is!”

Setting up for the second time, I finally sit next to my client and prepare to start. Nope, he doesn’t like my font. He is not verbal, so he writes me a note: Separate font. Crap, I don’t know what he means. I’m already double spaced. I motion to him with my fingers: Bigger? Smaller? He nods on bigger. Okay, I use CaseCat software. I have about 30+ different templates of different sizes and fonts to switch to. I switch to Arial 26. I write, “Test test.” Nope, he says something that I don’t understand. I changed my template randomly to find one he liked. Verdana 15? Nope. This happens at least five more times. Oh, and the planning administrator is sitting with his face in his palm looking from me to his paperwork, to me, to his paperwork, and now to me! Finally, the client mutters something that I understand, “The original.” Back to Arial 26.

10:26, and we’re off. My fingers were shaky from the anxiety of the morning. I kept mixing up the speakers every time they would switch. Which one is he? Ben? No, Jim? No, Ben? NO! It’s Jim! “Get it together.” I couldn’t remember my brief for applicant or advisement. My pink blouse is now stuck to my back. I am sure I had the just-woken-up look with makeup under my eyes and a droopy ponytail. I was focused, “Write every word perfectly. No misstrokes.” The client was glued to my screen.

The planning administrator finally opened the forum for public comment. My client was reading every word I wrote. “Does anyone wish to make public comments?” After a long pause while he was reading, he shoots his hand up. Now comes the dance as we attempt to take turns with my laptop. He would be furiously pecking away to write his comments and the board members would start talking. However, I couldn’t write what they were saying while my client was typing, because the realtime text would switch to the bottom of the screen and the client’s thought would be lost. I raised my hand to gesture a “pause.” I waited for my client to finish typing his response. Then I wrote what the board said. Then I voiced what my client had typed out. It was confusing. Thankfully, my client kept reiterating the same thing: he wanted a postponement due to not having a sign language interpreter.

One last thought: this was the third time a sign language interpreter did not show up and did not give notice. Three times now, a planning hearing for this matter has been set up a month out; the representatives from a HUGE, well-known company have come prepared with experts and strategy; the planning administrator has conducted and postponed a planning hearing; a CART provider has been arranged; and the client has shown up to debate this matter. How is it that a sign language agency has screwed up three times and neither sent out an interpreter nor arranged a sub? It baffles my mind. Crap does happen; you get a sub, and you get the job done! Period! This is not a job where you can be lackadaisical and call out sick without arranging a sub. Please let this be a lesson to all of us: get a sub. Oh, and leave your house more than two hours early if you are driving on the beastly LA freeways, ha!

10:46, we adjourned.

Source: Jenn Porto

How smart subtitling technology could help deaf people

Author: William Mager | Original article: BBC News | Reproduced with kind permission of the author

subtitling on Glass

Subtitling & new technologies

There are many new technologies that can help people with disabilities, like live subtitling 24/7 for deaf people, but how well do they work?

Deaf people always remember the first time a new technology came on the scene, and made life just that little bit easier in a hearing world.

I’ve had many firsts. Television subtitles, text phones, the advent of the internet and texting all opened up opportunities for me to connect with the wider world and communicate more easily.

So when I first heard about Google Glass – wearable technology that positions a small computer screen above your right eye – I was excited. Live subtitling 24/7 and calling up an in-vision interpreter at the touch of a button. Remarkably, both seemed possible.

That was a year ago. Since then, Tina Lannin of 121 Captions and Tim Scannell of Microlink have been working to make Google Glass for deaf people a reality. They agreed to let me test out their headset for the day.

First impressions

First impressions are that it feels quite light, but it is difficult to position so that the glass lens is directly in front of your eye.

Once you get it in the “sweet spot” you can see a small transparent screen; it feels as though it is positioned somewhere in the distance, and it is in sharp focus. The moment you get the screen into that position feels like another first – another moment when the possibilities feel real.

But switching your focus from the screen to what’s going on around you can be a bit of a strain on the eyes. Looking “up” at the screen also makes me look like I’m a bad actor trying to show that I’ve had an idea, or that I’m deep in thought.

The menu system is accessed in two ways. There is a touch screen on the side which can be swiped back and forth, up and down, and you tap to select the option you want.

Screenshot of Google Glass screen with subtitling

Google Glass can be used for live subtitling

Or you can control it by speaking, but this can be difficult if you’re deaf. Saying “OK Glass” to activate voice commands can be a bit hit and miss if your voice is not clear, like mine.

One of the main problems is the “wink to take a picture” setting. But I wink a lot. I also blink a lot. So I turn that setting off.

After a few minutes of reading online articles via Glass, it’s time to test out live remote captioning software in the real world. Lannin and Scannell’s service is called MiCap, a remote captioning service that works on several platforms – laptop, tablet, smartphone, e-book and Google Glass.

We set up in a quiet meeting room. After some fiddling with wi-fi and pairing various devices, we put a tablet in the middle of the table as our “listener”, and put the headset on. As three of my colleagues engage in a heated discussion about the schedule for programme 32 of See Hear, the remote captioner, listening somewhere in the cloud, begins to transcribe what they are hearing.

subtitling

My first reaction is amazement. The captions scrolling across the screen in front of my eye are fast, word perfect, with a tiny time delay of one or two seconds. It is better than live subtitling seen on television, not to mention most palantypists who convert speech to text. I can follow everything that is being said in the room. Even more impressively, this is the first time that the app has been tested in a meeting. I can look around, listen a bit, and read the subtitles if I miss something.

But after a while, tiredness overtakes excitement, and I take the headset off.

See Hear is broadcast on BBC Two at 10:30 GMT on Wednesdays – or catch up on BBC iPlayer. The new series begins on 15 October 2014.

The headset itself is uncomfortable and fiddly, but despite this my first experience of Google Glass was enjoyable. It doesn’t offer anything that I can’t already do on my smartphone, but the ability to look directly at someone at the same time as reading the subtitles does make social interaction more “natural”.

Future developments

I am excited about the apps and software being developed by deaf-led companies in the UK. Not just remote captioning – also remote sign language interpreting. UK company SignVideo are already the first to offer live sign language interpreting via the Android and iOS platforms, and say that they’ll attempt a Google Glass equivalent in the future if demand is high enough.

Other companies such as Samsung and Microsoft are developing their own forms of smart glass and wearable technology and as the innovations reach the mainstream the range of applications which could help disabled people seems likely to grow.

There are lots of exciting tech firsts to come but I still prefer a more old-fashioned technology – the sign-language interpreter. They’re temperamental, and they might make mistakes too, but they’re fast, adaptable, portable – and they don’t need tech support when things go wrong.

Follow @BBCOuch on Twitter, and listen to our monthly talk show

Captioned theatre: Tosca

tosca captioned theatre

If you’re an opera lover and have always wanted to see an opera with live captioning, you’ll be delighted to know that Soho Theatre are reviving their celebrated partnership with OperaUpClose with a production of Puccini’s masterpiece, Tosca. There will be a captioned theatre performance on Saturday 31st August at 3pm, and early booking is strongly recommended.

The story
East Germany, 1989. In the shadow of the Berlin Wall, Tosca, a fiery yet vulnerable singer, is the toast of the ruling communist élite. When her lover helps a political prisoner escape, Tosca’s world is torn apart, leaving her at the mercy of Scarpia, chief of the Stasi secret police.

Tosca is a taut thriller set in the dying days of communism, as the government turns to terrifying levels of surveillance and intimidation in a desperate bid to cling to power.

Acclaimed for its radical interpretations, OperaUpClose brings its trademark intimacy and immediacy to Tosca’s story of love, loyalty and corruption accompanied by a boutique trio of piano, cello and clarinet.

Booking details
Date: Saturday 31 August, 3pm
Tel:  020 7478 0100 (press 3)  Email: box1@sohotheatre.com
Tickets: £15.00

Venue: Soho Theatre, 21 Dean Street, London W1D 3NE
Soho Theatre