Science Committee Turns Attention to NSF Grant Strategy, Research Standards
On March 21, the House Science Committee’s Subcommittee on Research and Technology held the second of two hearings this month devoted to the National Science Foundation. Billed as a look at the “future opportunities and challenges for science,” it concentrated on how NSF organizes its grant-making across research directorates and on the virtues and limits of transparency in scientific work.
Rep. Barbara Comstock (R-VA), who chairs the subcommittee, said the hearing would not address NSF’s budget. However, referring indirectly to the Trump administration’s proposed cuts to science programs at other agencies, she did say, “I can say for myself and probably a few others here, we are very interested in maintaining that budget.”
Witnesses discuss future of cross-directorate grants
A question that attracted considerable attention at the hearing is whether NSF’s organizational structure sufficiently accommodates cross-directorate projects.
Keith Yamamoto, vice chancellor for science policy and strategy at the University of California, San Francisco, testified that NSF should encourage the development of “transdisciplinary science,” which he characterized as “virtually a merger of the physical and natural sciences, engineering, and computation.” He suggested NSF should create a “new organizational layer that floats above the directorates and is sectored into ‘big idea’ or ‘big challenge’ research programs.”
NSF Acting Chief Operating Officer Joan Ferrini-Mundy testified that work across directorate boundaries has already become “just as prevalent” as projects funded solely by a particular directorate. She pointed specifically to the Innovation at the Nexus of Food, Energy, and Water Systems program; the BRAIN Initiative; the Risk and Resilience activity; and the Research Advanced by Interdisciplinary Science and Engineering (RAISE) proposal mechanism.
Replying to a question about how to foster cross-directorate research, National Science Board Chair Maria Zuber warned that Congress should avoid creating new research silos, adding, “By specifying funding for directorates, that of course creates silos.”
The Science Committee majority has already declared this year that it will renew its push to require NSF’s budget to specify directorate-level funding and to require that 70 percent of NSF’s research funding be split between its Mathematical and Physical Sciences, Engineering, Biological Sciences, and Computer and Information Science and Engineering directorates.
Hearing explores potential and limits of research transparency
Jeffrey Spies, the chief technology officer of the non-profit Center for Open Science in Charlottesville, Va., testified that NSF should incentivize greater openness in scientific research. He argued that journal publications stress “novel results and clean narratives” but do not provide enough detail for others to replicate the reported findings. Scientists, he contended, should make their materials, methods, data, software, and analyses more broadly available; easier replication, he urged, would bolster scientific credibility.
Rep. Dan Lipinski (D-IL), the subcommittee’s ranking member, observed that one criticism of proposed transparency requirements for regulatory science is that much underlying data cannot be made public. Spies agreed that there are cases where one cannot be “completely open,” and suggested remedies such as making data privately available for inspection or releasing methods and materials but not data.
Comstock asked Spies whether increased openness would subject scientists’ work to “punishing” scrutiny. Spies replied that scientific culture should be more accepting of mistakes and that peer review of work before publication would be beneficial. Zuber countered that it is preferable for scientists to correct their own mistakes and that the scientific system should incentivize research groups to do so.
Late in the hearing, Yamamoto pointed out that reproducibility is not a straightforward standard of scientific quality. He noted that, when confronting complex problems, differences in hidden variables can lead equally valid studies to different results. He said,
It calls into question, in a way, attempts to fund studies to simply try to reproduce complex results. Understanding robustness is critical, but being able to label something as right or wrong based on whether it’s reproducible I think is problematic.
Reproducibility a focus of Smith’s statement
This month’s hearings on NSF were part of the committee’s preparations to introduce major, new authorization legislation for the agency. However, the main architect of that effort, Committee Chair Lamar Smith (R-TX), did not attend the second hearing because he was at the White House for the signing of the NASA Transition Authorization Act.
In his submitted written statement, though, Smith singled out research reproducibility as an issue of special interest to him, saying,
Reproducibility addresses and can prevent fraud and poorly designed and executed research. Unfortunately, there is evidence of the increasing frequency of non-reproducible experiments, particularly in certain fields of science.
What actions the committee might consider relating to reproducibility in NSF-funded research remains unknown.
Scientific standards a recurring theme in committee’s work
What is certain is that reproducibility, and scientific methodology more generally, has emerged this year as an important theme in the committee’s work.
The committee’s “HONEST Act,” which is scheduled for a House vote this week, requires that the Environmental Protection Agency base all new regulation on studies that are reproducible and have made their data publicly available. Discussing the legislation at a Heartland Institute event last week, Smith said it will “guarantee” that EPA will base its regulations on “legitimate science” and prevent the agency from “promoting a one-sided ideological agenda.”
In the same address, Smith also previewed a hearing that the full Science Committee has scheduled for March 29, saying, “It’s going to be on our favorite subject of climate change and also the scientific method, which is repeatedly ignored by the so-called, self-professed climate scientists.” While Smith did not cite the reproducibility issue specifically, he made clear that he regards much of climate research as methodologically unsound, saying it is more appropriate to call it “climate study” than “climate science.”
The problem of research reproducibility is currently attracting widespread interest and discussion within the scientific community. It was, notably, the subject of the latest National Academy of Sciences Arthur M. Sackler Colloquium, held on March 8. And, in her opening statement at the second NSF hearing, Rep. Eddie Bernice Johnson (D-TX), the committee’s ranking member, called reproducibility “a well-documented challenge across all STEM fields, and one for which this committee can help promote progress.”
At the same time, concerns are mounting that the issue is turning into a political weapon. Critics argue that strictly defining scientific corroboration in terms of reproduction unduly limits what counts as legitimate science. Johnson, moreover, noted a tendency to conflate the reproducibility challenge with research misconduct, adding, “I often hear the rare cases of misconduct being used as a sledgehammer to impugn scientists broadly.”