April 26, 2018

Study explores accuracy of residency applications

The misrepresentation of scholarly work among residency applicants has been documented for decades, said Louise Mawn, MD, professor of Ophthalmology and Visual Sciences at Vanderbilt University Medical Center.

Every year, residency programs grapple with unverifiable publications and other citations included on applications. Despite how persistent this unethical practice has been, no remedy had been proposed until the release of the study “Rate of Unverifiable Publications Among Ophthalmology Residency Applicants Invited to Interview” in JAMA Ophthalmology.

“The really nice part of this investigation and what makes our publication unique is that we identified a challenge and potential ways of addressing the issue in the future,” said Mawn, principal investigator of the study. “This was a significant finding that we knew was important to communicate to a wider body.

“Although others have identified unverifiable publications as a problem, no one has offered a solution. We are the first to land on a solution, and a really implementable one.”

Armed with the knowledge that many prospective residents submit applications with unverifiable publications, Mawn and colleagues set out to determine the rate of this occurrence among the pool of applicants at the Vanderbilt Eye Institute.

Heather Tamez, MD, current chief Ophthalmology resident at VEI, analyzed 322 residency applications for entering classes from 2012 to 2017.

She reviewed the publications reported in the applications and searched for each one to verify its authenticity.

She found that 74 percent of the applicants reported at least one published work. Of that group, 9 percent had unverifiable publications.

Ophthalmology residency applicants apply through a centralized application from SF Match Residency and Fellowship Matching Services, which includes a section for listing any research activities or projects.

It is here that Tamez, a co-investigator, found discrepancies.

“Students are left to their own devices to define or explain their work,” said Tamez. “There is that potential that they did not understand the way they were required to list their work and it could have been confusing to them.

“I would really like to see the SF Match application improved to better define what applicants need to provide.”

Study investigators hope their findings change the way information is submitted and that applicants will be required to include the unique number assigned to each article published in a peer-reviewed journal. Supplying that identifier, a PubMed Central reference number (PMCID) or a digital object identifier (DOI), would make the work easier to verify.

“Not only would it help in the verification process, but it may also deter applicants from putting an unverifiable work on his or her application if he or she was not able to provide this identifier,” said Tamez.
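For context (this sketch is not part of the published study), an identifier such as a DOI also makes the verification step easy to automate. The example below assumes the public Crossref REST API, which returns a metadata record for registered DOIs; the function name and sample DOI are hypothetical placeholders, and a PMCID could be checked in a similar way through NCBI's ID converter service.

```python
# Illustrative sketch: check whether a reported DOI resolves to a real
# metadata record using the public Crossref REST API.
import requests


def doi_exists(doi: str) -> bool:
    """Return True if Crossref has a metadata record for this DOI."""
    url = f"https://api.crossref.org/works/{doi}"
    resp = requests.get(url, timeout=10)
    # Crossref responds with 200 for registered DOIs and 404 for unknown ones.
    return resp.status_code == 200


if __name__ == "__main__":
    # Hypothetical example: a reviewer pastes in a DOI reported on an application.
    reported_doi = "10.1000/example-doi"  # placeholder, not a real citation
    print("verifiable" if doi_exists(reported_doi) else "could not verify")
```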

Although the study recommends a change in the SF Match review process, Mawn said VEI will begin asking residency applicants for supplemental information and screening them differently.

“We can change what we do immediately,” said Mawn. “In our publication we provide a tool to help limit the problem of application falsification by requiring applicants to document the locator information for their peer-reviewed articles. This practice may decrease these occurrences.”