From old-school VCRs to DVD players, HDTV, Blu-ray players and Netflix, closed captioning is all around us. But whether we use captions for foreign film translations or to understand TV dialogue, its variety of uses raises the question: What exactly is closed captioning?

Closed captions display spoken dialogue as printed words on a screen. Captions are often used by people with hearing loss so that they can follow along with others watching TV, for example. They are also beneficial for English language learners (ELLs) as well as adults and children learning to read. Subtitling differs from closed captioning in that subtitles are used specifically for translation of the dialogue and are often burned into the picture so that they cannot be turned off, whereas captions can either be burned into the video or left for the viewer to turn on or off.

Nicole Coffey explains closed captioning and its tie to the sports industry in an article that includes a video featuring Scott Pentoney of the ESPN Program Compliance Team, the group that organizes, tracks and reports all captioning on every ESPN domestic network. ESPN coordinates and schedules captions for all of its TV programming as well as its video on demand (VOD) system. Caption writers program the closed captions and then send the captioning data to the network’s programming team, which embeds that data into the video signal to be broadcast.
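The "captioning data" in this workflow is essentially timed text: each caption cue pairs dialogue with the moment it should appear and disappear on screen. ESPN's actual broadcast tooling is proprietary, so as a purely illustrative sketch, here is how such cues might be represented and serialized in the common SubRip (.srt) format that many captioning and video tools exchange:

```python
# Illustrative sketch only: models "captioning data" as timed text cues,
# serialized in the widely used SubRip (.srt) subtitle format.
from dataclasses import dataclass

@dataclass
class Cue:
    start_ms: int   # when the caption appears
    end_ms: int     # when it disappears
    text: str       # the dialogue to display

def fmt(ms: int) -> str:
    """Format milliseconds as an SRT timestamp: HH:MM:SS,mmm."""
    h, rem = divmod(ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1_000)
    return f"{h:02}:{m:02}:{s:02},{ms:03}"

def to_srt(cues: list[Cue]) -> str:
    """Serialize a list of cues into an SRT document."""
    blocks = []
    for i, cue in enumerate(cues, start=1):
        blocks.append(f"{i}\n{fmt(cue.start_ms)} --> {fmt(cue.end_ms)}\n{cue.text}\n")
    return "\n".join(blocks)

cues = [
    Cue(0, 2500, "Welcome back to the broadcast."),
    Cue(2500, 5000, "What a play by the home team!"),
]
print(to_srt(cues))
```

A broadcast pipeline would carry this kind of data in a broadcast caption standard rather than SRT, but the core idea of text keyed to timecodes is the same.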

Decades after closed captioning was popularized for TV and movie screens, Sony came out with an innovation that changed the game for closed captioning. In 2013, the company released its Sony Entertainment Access Glasses, special closed-captioning glasses that enable more moviegoers who are deaf and hard-of-hearing to actually go to theaters. The captions are projected onto the glasses using holographic technology and appear below the movie screen for the viewer to read. The glasses also come with added features such as adjustable audio levels for those who are hard-of-hearing as well as descriptive audio tracks for people who are blind. With its Access Glasses, Sony modernized the captioning process and brought it closer to real time. Certain theaters have these glasses available, including some Regal Cinemas across the country. Speaking of real-time interaction, Google’s famous Google Glass also has a new app for captioning conversations for those who are hard-of-hearing.

With over 36 million Americans having some type of hearing loss as of 2012, the demand for captioning is as strong as ever. Thanks to dedicated captioning specialists as well as new and emerging technologies, hard-of-hearing Americans have more avenues than ever to acquire information and enjoy entertainment with less of a struggle.

Did You Know?

Sports programming is one of the hardest genres to caption because of its specific vocabulary, fast-paced nature and long rosters. People who program captions, particularly for sports and other live, real-time events, are usually trained as court reporters. Both court reporters and closed caption writers use a stenotype keyboard, and the captioning language is similar to texting in that it uses many specific abbreviations, acronyms and other shorthand styles. According to Scott Pentoney from ESPN, a typical captioner (also called a speech-to-text reporter) types about 200 words per minute for a one-hour live show.