Natural language commands vital to making EHRs seamless, ‘delightful’

Jul. 18, 2019, 9:45 AM

Yaa Kumah-Crystal, MD, MPH, MS, envisions a world in which she and her fellow health care providers work seamlessly with technology to educate patients on how to manage chronic conditions. (photo by John Russell)

Next Up is a series featuring VUMC’s next generation of thinkers, leaders, providers, operators and explorers.

by Holly Fletcher

Yaa Kumah-Crystal, MD, MPH, MS, describes herself as an early adopter of technology who is seizing on the rapid embrace of natural language recognition systems such as Siri and Alexa to usher in an era of medicine where the tech is just as responsive.

Kumah-Crystal, assistant professor of Biomedical Informatics and Pediatric Endocrinology at Vanderbilt University Medical Center, is working on a project piloting in the Division of Pediatric Endocrinology that uses voice commands to retrieve information from the electronic health record (EHR). She wants to make sure the technology is designed to work with clinicians in a way that is organic and that adds to the patient-doctor relationship rather than detracting from it.

She’s passionate about why people should demand more of computers than they currently do, and envisions a world in which she and her fellow clinicians work seamlessly with technology to educate patients on how to manage chronic conditions.

People are open to rethinking the computer’s job in the exam room because of the evolution and saturation of the smartphone and virtual assistants such as Alexa, Kumah-Crystal said.

“Until recently people thought this was science fiction — it’s not, but they had to see it to understand it. I think people have a little more faith that even us here in the medical field could have technology that works seamlessly as an extension. It takes time, and it takes building confidence,” she said.


Computers are a necessary, and sometimes evil, part of life. What’s your goal?

I’ve had this idea for some time about how we could use natural language and commands, like we do with Siri and Alexa, to interact with health care records and get information out. When someone asks for something in the EHR, we should expect it to give the information doctors need to make the next decision. No more, no less.

And we know that’s going to be different if you’re inpatient or outpatient, or if you’re an obstetrician or an orthopaedist.


What could that look like?

I have a patient sitting in front of me with her mom who is very concerned about her child and is pouring her heart out about symptoms. Is the computer stealing my attention from that interaction? Or can I ask it to pull up the growth chart so we can look at it together or can it pull up a thyroid gland picture so I can explain it? Is it taking notes and is it trained to know what’s relevant to us, and is it equipped to spit back the information you need when you want it? It can be.

There’s this concept that when you’re using technology with another person in the room it has to be directly benefiting them, otherwise it’s stealing their time.

But right now, no one has taught the computer to do that. I feel like it’s the job of an informatician to teach a computer how to interact with humans.


Informatician — that’s an interesting word in this context. Why should people want to be one?

It means someone who seeks to understand data and seeks to apply data science principles to help humans function optimally.

Hopefully we will turn the tables on how we’re currently using our computer science interfaces and actually make it not just more efficient, but more delightful. We go into medicine because we want to help people and I want computers to help us do that — to help us help patients manage these struggles.

Computers don’t get tired. They don’t have days off or get distracted by the news. We can train them to push care forward.


Has technology hit the point where it can adapt to us rather than us constantly adapting to it?

That’s a good question, and I think it depends on the domain. In the consumer realm, the iPhone is the pivotal example of good user design. It was functional and it made sense. Whereas in medicine, we’re still getting there because the incentives are different. Apple wants to sell iPhones so they make the product highly desirable. In medicine, we’re there to take care of the patient.

I know we can do a better job of designing technology to help take care of patients, which is what we’re here to do. Computers can do so much more than we’re letting them do, and I think we’re limited by our own imaginations and our own ability to be flexible as much as we are our conceptions of what computers should be doing.


How did you get interested in this?

I was a biophysics major in college and we were required to do protein modeling, so I learned basic computer science principles and I thought it was cool — I could talk to computers in a figurative way and get them to do what I wanted. It demystified how computers work. I want people to know they aren’t magical boxes — code is literally ones and zeros. We can make them do what we want them to do. It just takes imagination and designing it with the end user — the doctor — in mind.
