
Deaf scientists thrive with interpreters and technology

23 July 2021

Though still underrepresented in STEM, deaf and hard-of-hearing scientists are excelling in their fields and developing ways to more seamlessly communicate with their colleagues.

At NASA’s Armstrong Flight Research Center, Johanna Lucht (right) observes mission control data on her left monitor, while an American Sign Language interpreter provides flight communication information on her right monitor. Seated to her left is lead systems engineer Keith Schweikhard. Credit: Lauren Hughes/NASA

On 4 April 2017, NASA engineer and programmer Johanna Lucht sat in front of her monitors at the Armstrong Flight Research Center’s mission control center in California, observing the flight data coming from a Gulfstream III jet. Two of her computer screens displayed GPS and navigation systems data. On a third screen, an American Sign Language (ASL) interpreter communicated in real time between Lucht, her colleagues in the control room, and the flight team.

That day Lucht became the first deaf engineer to work in a NASA control center during a crewed research flight. Though she is far from the first deaf scientist to work at NASA or in computer engineering, scientists with disabilities, including those who are deaf or hard of hearing (D/HoH), are underrepresented across STEM fields. Only 1.4% of the nearly 30 000 respondents to the 2019 National Center for Science and Engineering Statistics survey of STEM doctoral recipients reported being D/HoH. That rate is about half what one would expect on the basis of demographics. Sixty-seven of those newly minted PhDs were physical scientists, and only 9 were geoscientists.

Yet for working D/HoH scientists, access to technology and interpreters has lowered barriers that historically have threatened the ability to thrive in a hearing world. For example, Michele Cooke, a geoscientist at the University of Massachusetts Amherst, wrote in 2019 about how the proliferation of captioning technologies that use artificial intelligence has expanded her access to webinars. “For the first time, I can understand a webinar,” she wrote. We asked Lucht and Giordon Stark, a deaf experimental physicist at the University of California, Santa Cruz, to describe their research and the steps they’ve taken to improve communication with their fellow scientists.

Increasing accessibility at NASA

Until the age of 9, Lucht had no access to language, in ASL or any other form of communication, an experience pervasive in the D/HoH community known as language deprivation. But she found an early connection to the world through mathematics. Lucht has said in interviews that she was drawn to the visual, “intuitive” nature of math.

While pursuing a computer science degree at the University of Minnesota Twin Cities, she started an internship at NASA Armstrong’s research and engineering department. She was hired to stay on at the agency, where she currently prepares software for aircraft test flights and develops user interfaces for displaying ground test data, in addition to working in the control room as a systems engineer.

The Gallaudet Eleven

Members of the Gallaudet Eleven prepare for zero-g flight. Credit: Gallaudet University Archives, David Myers Collection

In the early days of NASA, deaf and hard-of-hearing individuals were working at the cutting edge of human spaceflight.

In the late 1950s, the agency was seeking to understand the impacts of gravity on the human body. Because of motion sickness, hearing astronauts were not able to withstand g forces long enough for NASA scientists to capture useful data. So NASA selected from Gallaudet University—the first institution of higher education specifically designed for D/HoH students—11 deaf men who were immune to motion sickness because of inner ear damage caused by spinal meningitis. Those participants informed the development of a baseline for human tolerance of g forces that is still used to test astronaut candidates today.

In April 2017, the same month Johanna Lucht sat in mission control for the Gulfstream III flight, the Washington, DC, university and NASA unveiled a museum exhibit to honor the Gallaudet group’s contribution to human spaceflight. Known as the “Gallaudet Eleven,” the participants were Harold Domich, Robert Greenmun, Barron Gulak, Raymond Harper, Jerald Jordan, Harry Larson, David Myers, Donald Peterson, Raymond Piper, Alvin Steele, and John Zakutney.

Lucht says that at school she had easy access to interpreters who were comfortable with technical classes. But when transitioning to a work environment, she faced challenges in building a new support network and accessing technical interpreters. When physically located at NASA, Lucht’s access “used to mean borrowing an interpreter from the Air Force,” she says. That situation was not ideal because her work often includes supporting flight tests from 7am to 5pm every day, and “the interpreter had something like seven clients to support on the Air Force side.” NASA needed to hire its own interpreter.

In Lucht’s original desk setup, her monitors blocked her view of any interpreters. The mission control team worked with her and found that a video feed of the interpreters on her monitor was the best approach to use in the control room. Lucht’s colleagues “realize that when things are inaccessible to me, they cannot access me and utilize the skills I’m there to provide,” she says.

She still has to navigate communication challenges in her daily work with her hearing colleagues, particularly when articulating highly technical information. She notes that when communicating with hearing people, “I lean toward Conceptually Accurate Signed English [CASE] a little to help a nontechnical interpreter get terminology correct.” CASE is a communication method that draws signs from ASL—which itself is grammatically different from English—and uses them with English grammatical rules and constructs while excluding words that add little additional understanding to the conversation, such as the or a. The process is smoother when the interpreter is well versed in NASA’s technical terminology. When all else fails, Lucht says she types information on her laptop.

When Lucht talks to other deaf people or colleagues, “I use ASL all the way, with occasional finger spelling of technical terms—usually a product name or an acronym.”

Adding physics vocabulary to ASL

Stark, a postdoctoral researcher at UC Santa Cruz, searches for new physics with CERN’s ATLAS collaboration and works to improve the experiences of D/HoH physical scientists and the interpreters who support them.

“In my case, the largest issue is usually interpreter burnout from dealing with more technical concepts, jargon, and accents for a longer period of time,” he says, pointing to the hurdles he’s faced at international physics conferences. He notes that in those situations he tries to provide real-time feedback to interpreters so they feel comfortable using terminology and jargon that are unfamiliar to them. Rotating interpreters or giving interpreters breaks between events also helps.

Many interpreters are afraid of technical interpreting, he explains; when they don’t understand the content, they worry about making mistakes. As a result, the shortage of technical interpreters creates obstacles for D/HoH STEM students and professionals, including limited opportunities to network with hearing colleagues and the need for additional time outside class to meet with an interpreter.

On-the-job experience is crucial for interpreting because ASL, like other languages, is constantly evolving. At the 2019 American Physical Society Division of Particles and Fields meeting, Stark shared some of his diversity and inclusion work aimed at developing new signs for physics terms to use with interpreters. At the time, he was partnering with ASL experts and other deaf scientists on ASLCORE, a community platform for the development and adoption of signs, including those for physics terms such as neutrino and particle collision.

Four people sign science, technology, engineering, and math in American Sign Language. Credit: ASL Clear

The project is among several ongoing efforts to add science terms to ASL. Groups such as ASL Clear, which has partnered with Worcester Polytechnic Institute in Massachusetts, provide science lectures in ASL and other resources to the D/HoH community. A separate effort by the Learning Center for the Deaf and Harvard University aims to create quantum physics signs.

The ASLCORE project faced challenges, Stark says, because of hurdles that included “trying to develop a language for a community that still suffers from significant language deprivation in the first place.” But he continues working on making physics more accessible, including by serving on the US ATLAS Diversity and Inclusion Committee. The goal, he says, is to foster the next generation of deaf physicists and scientists.

Editor’s note: This article uses the lower-case term deaf and hard of hearing so as not to assume that all deaf or hard-of-hearing people are culturally Deaf, meaning that they embrace cultural norms, beliefs, and values of the Deaf community. Exceptions occur where the term appears in a quote or in the name of an organization.
