
June 16, 2022

Researchers add sign language videos to electronic surveys

A new external module in Vanderbilt’s REDCap improves electronic survey accessibility for respondents who are deaf. The module makes it easy to embed individual sign language videos for each survey question and answer.

The module, called the Rochester Accessibility Survey, was created by a team at the National Center for Deaf Health Research (NCDHR), part of the University of Rochester Clinical and Translational Science Institute, working with the Data Core team at the Vanderbilt Institute for Clinical and Translational Research (VICTR).

REDCap (short for Research Electronic Data Capture) is a secure web application for building and managing online surveys and databases. Designed to support online and offline data capture for research studies and operations, the application was launched at Vanderbilt University Medical Center in 2004 and now has nearly 6,000 licensed REDCap Consortium organizations in 145 countries.

“We were thrilled to work with the NCDHR team on this initiative to fill an important need, empowering both researchers and research participants,” said REDCap’s creator, Paul Harris, PhD, professor of Biomedical Informatics, Biomedical Engineering and Biostatistics and director of the Office of Research Informatics at VICTR. “The Rochester team has decades of experience working with Deaf and DeafBlind communities, so we were grateful for the opportunity to collaborate.”

VICTR Data Core team members instrumental in developing and disseminating the module include application developers Carl Reed and Mark McEver.

For more information, see the NCDHR blog post about the project.