It’s been a quiet month on the blog, but April is an important month at CSUPERB so we need to celebrate!
We announced the CSUPERB “major grant” awards and the Presidents’ Commission Scholars this week. The Faculty-Student Collaborative Research Grants and the Presidents’ Commission Scholars are two of the most popular CSUPERB programs, as gauged by campus participation. So our normally quiet office enjoyed the email buzz from students, PIs, chairs and deans this week!
Campus participation is defined by applications received from each campus to a CSUPERB grant or award program, or by symposium registration. Data shown for AY06/07 – AY12/13.
CSUPERB made 36 grant awards totaling $574,685 to CSU faculty at 17 CSU campuses. Awards were made through four competitive CSUPERB grant programs: New Investigator, Research Development, Entrepreneurial Joint Venture and Programmatic Development. Faculty review panels evaluated 95 proposals from principal investigators (PIs) at 19 different CSU campuses. Averaged across the four programs, awards were made to 38% of the proposals received.
I use the scare quotes around “major grants” because, while these are the largest awards CSUPERB makes, they are all seed grants that pay out $15,000–$25,000 over 18 months. The aim of these programs is to support preliminary work that can lead to follow-on funding from external agencies and organizations. Those follow-on grants support collaborative faculty-student research, innovative educational programs, and knowledge and technology transfer. The reality of biotechnology-related scholarship is that significant funds (>$15k/year) are needed to support research programs. Students gain deep learning opportunities working with PIs or participating in courses that are built on faculty scholarship. As a consequence, grant-getting is fundamental to biotechnology education and research. We wish all our new PIs the best of luck in the lab, field and clinic!
Sixteen undergraduate researchers, the 2013 Presidents’ Commission Scholars, will be carrying out faculty-mentored biotechnology research projects on 12 different CSU campuses this summer. CSUPERB provides $8,000 to support each of these summer research projects. This year’s request for proposals invited applications from CSU students early in their academic careers. The majority of applications were still from students in or starting their junior (3rd) year, but the selection committee funded freshmen and sophomores as well. Jaimey Homen, a chemistry student finishing her first year at Sonoma State University, will be working with Dr. Carmen Works to characterize photochemically activated molecules. The group’s long-term goal is to engineer molecules that deliver carbon monoxide (CO) to protect specific biological tissues. For context, CO has been shown previously to improve organ transplant survival rates. Ms. Homen became interested in undergraduate research opportunities and met Dr. Works by participating in SSU’s Freshman Learning Community. We hope Ms. Homen and the other 2013 Scholars have a wonderful summer!
CSUPERB’s peer review process starts in February when proposals are received. This spring, 57 faculty from 20 CSU campuses served on six different proposal review panels. The major grants were reviewed at meetings April 13-14 in San Jose; four different panels discussed and evaluated proposals that weekend. The travel grant and Presidents’ Commission Scholar applications are reviewed by panels working online and by teleconference. Overall, our faculty reviewers do a great job selecting promising research projects to fund. For every major grant dollar awarded by CSUPERB between 2004 and 2010, PIs went on to win $14 in grants from external organizations — a 14-to-1 fiscal “return on investment.” This, of course, is a direct credit to the excellent and competitive faculty scholars at work in the CSU.
We celebrate and justify our grant programs by pointing to the fiscal return-on-investment, but we also monitor student impact and knowledge transfer (publications, collaborations). But any measure of peer review “success” must come with an acceptance of failure as well. Not all the engineered strains survive, not all the experiments work, not all the hypotheses pan out. Not all the PIs write well-crafted follow-on grant proposals, not all the research collaborations hold together, not all the innovative ideas find a good fit at a funding agency or an angel investing group. Some ideas are ahead of their time, some skate too close to the bleeding edge, some are out of step with prevailing opinions. We teach our students and assistant professors that their success will depend on their ability to shake off failure and move on to write the next draft, design the next experiment, or repeat the test until it’s significant. Some of those successes will come within the year, but scientific triumphs often take longer than we expect or come later in a career than hoped.
Expert scientists, engineers and clinicians are familiar and comfortable with these truths. None of us can predict the research projects that will work or have the greatest impact on society. But if we don’t talk about the failures inherent in scientific research and development, unintended and “disastrous”* consequences result.
Scientific peer review came under increased congressional scrutiny this week.** Rep. Lamar Smith challenged the National Science Foundation’s (NSF) peer review processes and proposed new review criteria. Rep. Smith went on to request access to the “scientific/technical reviews and Program Officers Review Analysis” for five specific NSF grants. Yesterday President Obama defended scientific peer review during a talk at the National Academy of Sciences, stating, “I will keep working to make sure that our scientific research does not fall victim to political maneuvers or agendas that in some ways would impact on the integrity of the scientific process.”
Faculty reviewers and PIs probably don’t think often enough about the integrity underlying our peer review systems. More often we grumble about nit-picking reviewers, the lack of high-risk, high-impact ideas, program officers’ insistence on well-written, on-time reviews, and the dearth of funds needed to support biotechnology innovation. But if we sit back and ponder the implications of Rep. Smith’s requests to NSF, we suddenly see the wonder and power of our grass-roots, peer-driven national science agenda. This is a process that serves to select the best science as-we-see-it, to plant the seeds of new technologies and therapies, and to train generations of the nation’s best-and-brightest scientists, engineers and clinicians. The U.S. peer review systems underlying our research and development enterprise aren’t always pretty or perfect or innovative, but like our democracy, they’re highly regarded worldwide despite inherent incrementalism and consensus-building. The corollary is that the outcome of peer review is, in aggregate, the outcome*** of a national research enterprise that remains envied worldwide.
Can we improve the system? Sure. Even at CSUPERB we evaluate our programs, iterate our processes, and tune the strategic intent of our grant programs. We do that with significant input from the expert science and engineering faculty involved with the program. We adjust to the budgets supplied by the taxpayers via the California legislature and the governor. We keep our eyes on how biotechnology is defined by the external life science community. But – as of yet – we have not had to change how and what biotechnology research we fund in response to political pressure of any kind.
I understand the politicians in Washington, D.C. hold the purse-strings, but I sincerely hope political committees will not dictate how and what American science is done going forward. To go that unscientific and undemocratic route would, indeed, be disastrous to our research and development enterprise.
* Characterization attributed to Bruce Alberts at Nature Blogs.
**The blogosphere is just getting heated up about this political power-grab of peer review, but some good context is provided by Derek Lowe and The AmericanScience bloggers.
***U.S. research outcomes can be reported many different ways, for example, see NSF’s measures and outcomes and Ben Bernanke’s take.