January 1, 2009

Canary in the research lab

Illustration by Ken Perkins

There are ominous signs that all is not well in the nation’s biomedical research enterprise.

Thanks to five years of flat budgets at the National Institutes of Health, which supports the bulk of basic biomedical research in the United States, only about one in 10 research proposals is funded on the first submission, down from 30 percent a decade ago.

As a result, young scientists increasingly are leaving university research labs, either taking jobs in countries like Singapore that have more robust research budgets or leaving science altogether. Perhaps most ominous: U.S. students rank below their peers in other—mostly Asian—countries when it comes to mastery of math and science.

As Microsoft founder Bill Gates put it to the House Committee on Science and Technology last March, “Too many of our students fail to graduate from high school with the basic skills they will need to succeed in the 21st century economy… Although our top universities continue to rank among the best in the world, too few American students are pursuing degrees in science and technology.”

“My fear is we’re going to lose a generation of young investigators,” adds William Lawson, M.D., assistant professor of Medicine at Vanderbilt Medical Center. Lawson contributed to a March 2008 report by a consortium of academic health centers, including Vanderbilt, which warned that the nation’s research “pipeline” may be “broken.”

Some observers dispute this “sky is falling” reaction. By many measures, from R&D spending to the annual number of highly cited publications, from the reputation of its universities to the lion’s share of Nobel laureates who work here, “the United States still leads the world in science and technology,” conclude RAND Corporation scientists Titus Galama, Ph.D., MBA, and James Hosek, Ph.D., in a 2008 report, “U.S. Competitiveness in Science and Technology.”

That may be true, but these statistics are, in a sense, “historical,” replies Lawrence Marnett, Ph.D., director of the Vanderbilt Institute of Chemical Biology.

“This (report) reflects that you’ve got people who have been around a long time… who are in the prime of their careers and well supported for a long time and are doing a great job,” Marnett says. “Looking at the future, it’s very clear to me that we can’t even judge how many… young people are turned off on the whole notion of going into science.”

There may be, as yet, few signs of calamity in the research lab, but there certainly is no lack of voices sounding the alarm.

“It’s very discouraging to hear that many of our best young scientists are 41 or 43 years old when they receive their first grant,” says Douglas Melton, Ph.D., co-director of the Harvard Stem Cell Institute who is searching for a cure for type 1 diabetes. “It’s just unreasonable for society to expect that people would devote 10 to 15 years to their education and then not be given real independence until they were in their 40s.”

Jack Dixon, Ph.D., HHMI’s chief scientific officer, warns against “burning up talent.”

Paul Fetters for HHMI

“For young people, these are the most productive, innovative and exciting times in their careers,” adds Jack Dixon, Ph.D., chief scientific officer of the Howard Hughes Medical Institute and former dean for Scientific Affairs at the University of California, San Diego. “If they’re spending a lot of their time on the treadmill of writing and submitting and rewriting and resubmitting, then they’re not spending their time doing experiments that will lead to breakthroughs in innovation.

“You can’t let talented young scientists go unfunded for five to eight years. They will end up washing out of the system. You are basically burning up talent that took many years to develop. In the Midwest,” says Dixon, who formerly taught at Purdue, Indiana University and the University of Michigan, “we call that ‘eating your seed corn.’”

Inadequate funding squelches creativity, especially for young investigators, Lawson continues.

“When we put a grant in, one of our biggest criticisms is ‘you are being too ambitious’ or ‘you’re being too creative,’” he says. “This isn’t just a matter of losing people. We run the risk of stifling or slowing down our discoveries because investigators are being told to avoid riskier ideas and pursue more predictable avenues with their research. Essentially, at times we have been told that, ‘You need to back off. You need to make this a safe research plan.’”

Despite its budget limitations, the NIH has tried mightily to support innovative research.

In 2004, under then-NIH Director Elias Zerhouni, M.D., it launched the Pioneer Awards, five-year grants of $500,000 that gave selected recipients the rare chance to conduct high-risk, high-impact biomedical research.

Steven McKnight, Ph.D., professor and chair of biochemistry at the University of Texas Southwestern Medical School, was one of the first Pioneer awardees. He used the funds to investigate the metabolic cycle of yeast, later extending those discoveries to the brain in hopes of laying the groundwork for understanding which biochemical reactions drive sleep and exhaustion in living organisms, and which molecular processes are restored by the act of sleeping.

Vanderbilt’s Mark Magnuson, M.D., says NIH is the “most successful agency.”

Photo by Joe Howell

This kind of research, McKnight says, might not be “sexy,” but it may be extremely important.

Zerhouni also established the New Innovator Award, based on the idea that the NIH should expand beyond its standard funding mechanisms to promote higher-risk research projects by early-career investigators.

Survival of the fittest?

Marnett calls the NIH “the crown jewel” of the U.S. government in terms of its effectiveness. His colleague Mark Magnuson, M.D., who directs the Vanderbilt Center for Stem Cell Biology, adds that it’s “probably the most successful agency that the United States government ever created in terms of its impact and how well it spends money.”

Vanderbilt’s Nancy Brown, M.D., says scientists must tell their story better.

Photo by Joe Howell

And yet because its purchasing power has dropped by 13 percent due to inflation since 2003, the NIH is less able to do what it used to do so well. The scarcity of funding is particularly troublesome for young scientists.

“It’s been said that in the past we raised physician-scientists like guppies where you gave birth to thousands of them and a few survived,” says Nancy Brown, M.D., chief of the Division of Clinical Pharmacology at Vanderbilt.

“But when you have limited resources, it’s harder for those guppies to survive,” Brown says. “There are more predators. What we really need is a model like raising mammals, where you nurture people along.”

It’s an apt analogy. Not only is the next generation of scientists at risk, but so is the next generation of their ideas—and the potential breakthroughs that could result.

“What the lack of funding always does in any ecosystem is halt innovation,” argues Jeff Balser, M.D., Ph.D., dean of the Vanderbilt University School of Medicine and associate vice chancellor for Health Affairs.

“If the NIH isn’t funding the most innovative science, the universities can, but they can only do that for so long because there’s only so much money,” Balser says. “At the end of the day, the impact of a declining NIH budget is less money for pursuing highly innovative ideas.”

In May 2005, U.S. Senators Lamar Alexander (R-Tennessee) and Jeff Bingaman (D-New Mexico) asked the National Academies, which include the National Academy of Sciences, the National Academy of Engineering, and the Institute of Medicine, to assess the nation’s “ability to compete and prosper in the 21st century.”

In response, the academies formed a blue-ribbon committee of experts in a wide range of fields, from engineering to genetics.

The committee’s report, “Rising Above the Gathering Storm: Energizing and Employing America for a Brighter Economic Future,” was submitted to Congress five months later. It called for, among other things, recruiting 10,000 science and math teachers each year, encouraging young people to earn college degrees in science by providing scholarships, and increasing federal investment in long-term basic research.

“The most important barrier that must be surmounted is the poor science knowledge of the teachers who are responsible for teaching science to our students,” says Roy Vagelos, M.D., retired chairman and CEO of Merck & Co. who served on the committee that produced the report.

“Many of the secondary school teachers of biology, physics, and chemistry have not majored in those subjects; some have never had a major course in the subject they are teaching,” says Vagelos, a member of the board of the National Math and Science Initiative. “The same can be said for all of the sciences.”

The report led to passage in 2007 of the America COMPETES Act, which among other things would increase recruitment and scholarship funding for future K-12 science and math teachers, advance the knowledge base of existing teachers through continuing education programs, and provide more science and research opportunities to middle and high school students. As yet, however, the law has not been fully funded.

Another challenge to the nation’s research enterprise cited in the 2005 report: visa laws that restrict foreign-born, American-trained scientists from taking jobs in the United States.

Given that immigrants have long infused lifeblood into American science—in 2008, for instance, the Nobel Prize for Chemistry went to Chinese-American Roger Tsien, Ph.D., and the Nobel Prize for Physics to Japanese-American Yoichiro Nambu, D.Sc.—the report questioned the wisdom of rewarding brilliance in the laboratory by sending scientists back to their home countries, many of which don’t have the resources to support high-level research.

“There are lots of ways that we in the United States are shooting ourselves in the foot,” says Bruce Alberts, Ph.D., professor of biochemistry and biophysics at the University of California, San Francisco, and past president of the National Academy of Sciences.

30 to 1 return

Why does it seem so hard to make a case for investing in biomedical science?

After all, notes John Oates, M.D., professor of Medicine and Pharmacology at Vanderbilt, thanks to basic biomedical research, rheumatic heart disease “almost doesn’t exist in the United States today because of antibiotics.” The iron lung, which enabled victims of polio to breathe, has been retired.

Among the most dramatic success stories is the treatment of cardiovascular disease, which over the past 20 years has reduced annual deaths from heart disease by 600,000—and all for a research investment of $30 per American, Hamm says.

Writing in the 2003 book “Measuring the Gains from Medical Research,” Harvard economics professor David Cutler, Ph.D., and graduate student Srikanth Kadiyala estimated a 30 to 1 return on that investment. “Our unambiguous conclusion is that medical research on cardiovascular disease is clearly worth the cost,” they wrote.

New cancer diagnoses and cancer death rates also have fallen in recent years. However, the collection of diseases known as cancer still kills 560,000 Americans annually, and there are increasing calls to revamp the entire research enterprise in order to achieve faster progress against these diseases.

This issue was raised last May during the annual Forum on Science and Technology Policy in Washington, D.C., organized by the American Association for the Advancement of Science.

As quoted by Chemical & Engineering News, Christopher Hill, Ph.D., professor of public policy and technology at George Mason University, said the public’s inability to understand what scientists are doing, and its growing frustration with the “failure” of science and technology to solve the world’s major problems, will “make the public less likely to remain convinced that expenditures on science and technology are an unalloyed good thing.”

That frustration may result in a shifting of funds to “problem-solving research,” Hill predicted.

One reason basic biomedical research is such a hard sell is that innovations may take many years to come to fruition.

They are characterized by years and years of painstakingly derived results before progress begins to accelerate, eventually leading to leaps in knowledge and, finally, to life-saving treatments and cures. They are also difficult to visualize at a glance; it’s much easier to spot innovative technology in cars, computers and cell phones.

For example, life scientists deliberately stayed in the background while physical scientists initially—and successfully—pitched the National Nanotechnology Initiative to Congress as a potential boon to the microelectronics industry, “because,” says Cyrus Mody, Ph.D., an assistant professor at Rice University who teaches the history of innovation and technology, “that’s an industry where talk about competitiveness is always on the table.”

The Human Genome Project similarly required a hard sell by scientists and a multi-billion-dollar leap of faith by the government. Many politicians and academicians had doubts about the rationale for such a costly venture. Originally proposed as a way to study mutations caused by nuclear exposures, then cancer, and later genetic diseases, the Human Genome Project called upon a swarm of researchers from various disciplines to create a genetic map of the human chromosomes, identifying all of the estimated 20,000 to 25,000 genes in human DNA.

The project, completed in 2003, didn’t give scientists any answers to cancer or diabetes or birth defects—but it did offer them high-resolution navigation tools and clues for where they should begin looking.

Such a sell would be more difficult today, scientists admit, given the current state of the economy and the equally urgent challenges facing the nation’s health care delivery system. “In the public’s mind, they see that we’re 40th in health or wherever we are nationally,” Brown says. “I think the public conflates the mission of the NIH with other social issues and other health outcome issues that we need to fix as well.

“We need to be very specific about the positive outcomes of NIH dollars because they’re so far downstream with health outcomes and they may not understand that there are many steps in between.”

“Honestly, if we don’t talk about what we’re doing, and if we don’t sell what we’re doing and point out the benefits that accrue from it, we have nobody to blame but ourselves,” adds Ellen Wright Clayton, M.D., J.D., who directs the Vanderbilt Center for Biomedical Ethics and Society.

“We exist at the sufferance of the people, and so we’re accountable to them and we just have to tell the story,” Clayton says. “It’s a great story. But that’s what we have to tell them.”

Lisa A. DuBois and Nicole Garbarini contributed to this story.