Deaf community to Elon Musk: Subtitles are not the same as sign language
Posted February 23, 2024 4:08 pm.
A recent post by Elon Musk, CEO of X, formerly Twitter, provoked both anger and exasperation from members of the deaf community.
The post was in response to a video that had subtitles as well as an American Sign Language (ASL) interpreter on screen.
Musk asked “What’s the point of sign language in a video if you have subtitles? Am I missing something?”
Many commenters agreed with him, but many others online called the post ignorant and uneducated.
“I thought perhaps he just genuinely didn’t know. But then based on a number of comments that followed, I thought this is a general understanding,” says Leah Riddell, President of the Ontario Cultural Society of the Deaf, with ASL interpreter Jennifer McConachie.
“I really want to take a look at it from a positive angle. And I want to hope that he’s coming from a place of curiosity and this is therefore an opportunity for those to learn from it. Yes, it evokes a lot of anger, but we have to continue educating.”
Riddell says she and many in the community are using this as a teaching moment to highlight the importance of providing ASL interpretation, as she explains that English subtitles and ASL are neither the same nor interchangeable.
“ASL is its own unique language with its own grammatical structure, comprised of cultural nuances and syntax. There are accents, there are tones, there are emotions conveyed. Seventy per cent of the language is conveyed through facial expressions and body movement. And so ASL is not English. They are very different languages,” she says.
Riddell adds that text and subtitles do not convey all the necessary information.
“Captions don’t always capture tone and emotion – it’s very two dimensional. There are no other nuances. When you see sign language, it’s conveyed on the face and in the facial expressions. And so we can gather tone that way through a three-dimensional feature versus a two-dimensional feature … we see that in the body language. We can understand pitch and tone that way,” she says.
“If somebody says the word ‘yes,’ what’s behind that? Is it a definitive ‘yes’? Or is it a (non-committal) ‘yes’? That can be conveyed in a number of ways, and by showing that through sign language, we can best [understand] their intent.”
Riddell says any efforts towards being more inclusive must incorporate ASL interpretation as part of an overall strategy.
“We talk a lot about DEI – diversity, equity, and inclusion – but we’re neglecting accessibility … access to language is what’s lacking … it’s not about convenience, it’s about the inclusion of accessibility. We have a diverse population. We want to ensure people can participate, contribute, and feel a sense of belonging and not be left out. And that’s where we have to understand the need to provide accessibility to enable inclusion,” she says.
She adds that most content or events held by the deaf community provide access to hearing individuals.
“We provide captions, we make our content with voiceover, we provide translation services. Why isn’t the broader society meeting us halfway?” she asks.
Riddell says when it comes to accessibility, there is no one-size-fits-all method.
“We have to ensure that people get what they need and not just limit it to their audiological abilities or their ability to speak. You need to reach out to those with lived experience and ask them what it is that they need,” she says.
“When it comes to media, provide it in all formats where possible with captioning, interpreting services as well as the spoken component … this really does send a powerful message of acceptance and respect for every individual and their communication needs.”
Thanks to Jennifer McConachie from Asign Interpreting & Translation Services, Patti McFarlane, Mariola Trzcinska and Kelly Hayes for their interpretation contributions to this story.