The National Academies committee tasked with exploring how to move toward an open science enterprise held a symposium on Sept. 18 where a range of stakeholders discussed how to expand open access to scholarly publications, improve standards and best practices for sharing data, and better manage data repositories.
On Sept. 18, the National Academies study committee “Toward an Open Science Enterprise” convened a symposium at which a range of scientific community stakeholders offered their perspectives on how best to achieve the committee’s goals. Slide presentations from the meeting are available here.
Publishers taking a variety of approaches to enhance open access
At the symposium, representatives from several publishing companies emphasized their ongoing commitment to expanding access to research while maintaining sustainable business models.
Holly Falk-Krzesinski, vice president of global strategic networks at Elsevier, discussed the differences between the various open access models being used in scientific publishing. In “gold” open access journals, all content is available for free online immediately after publication, and authors or the institutional funders of their research must pay an article processing charge (APC). Under “green” open access, sometimes called “delayed gold,” articles are released publicly in their final form, but only after a specified embargo period. Journals may also permit authors to make articles freely available either on preprint servers prior to publication or as manuscript versions on the author’s website or in institutional archives. Michael Forster, managing director of IEEE Publications, described how some non-open access IEEE journals also give authors a “hybrid” option of paying an APC to make individual articles immediately available.
Several speakers expressed concern that APCs could become a significant financial burden on researchers and institutions. Ivy Anderson, director of collection development and management for the California Digital Library, and Tyler Walters, dean of university libraries at Virginia Tech, suggested that universities could designate or contribute funds to help cover APCs. Jennifer Hansen, senior officer at the Bill and Melinda Gates Foundation, noted that her organization covers all open access charges and requires all articles from funded studies to be open access immediately upon publication.
Federal agencies typically require publications resulting from research grants they sponsor to be open access after a specified embargo period. Howard Ratner, executive director of the Clearinghouse for the Open Research of the United States (CHORUS), explained that his organization works with the agencies to facilitate open access to publications within the existing infrastructure for scholarly communication. Ratner noted that CHORUS has developed partnerships agency by agency and has aimed to ease the administrative burden of researcher compliance with open access requirements.
Joerg Heber, editor-in-chief of the Public Library of Science (PLOS) ONE, and John Inglis, executive director of Cold Spring Harbor Laboratory Press and co-founder of bioRxiv, highlighted the importance of preprint servers as a way for researchers to share and evaluate data and findings while the lengthy peer review and publishing process is underway. However, Inglis indicated that there is still some anxiety in the research community over preprint results being “scooped” by other researchers without attribution.
Inglis also said that publishers are becoming more accepting of preprint servers, observing that the tool is “gaining momentum” across scientific disciplines. The American Geophysical Union (AGU) recently announced it is developing its own preprint server for Earth and space sciences in partnership with publishing software company Atypon, and will accept papers that have been published on preprint servers.
Adoption of open data sharing standards raised as key concern
Several publishers emphasized the importance of adopting community guidelines that would require or encourage researchers to make the data underlying publications available whenever possible. While some publishers have adopted open data requirements with exceptions for data related to national security or patient privacy, for example, others have been hesitant to adopt overarching data policies. Both Falk-Krzesinski and Forster expressed concern about any potential implementation of “one-size-fits-all solutions.”
Kenton McHenry, technical coordinator of the National Data Service Consortium, and Daniel Goroff, a vice president at the Alfred P. Sloan Foundation, indicated that uncertainty around data security is a serious concern that erodes trust in repositories. Goroff suggested that repositories could address the issue by limiting access to data and creating a network to share standards and best practices for generating data.
Another significant challenge highlighted in many presentations was the need for common guidelines and adherence to the “FAIR principles” — standards for making data findable, accessible, interoperable, and reusable. Shelly Stall, assistant director of AGU’s data management assessment program, cited a recent survey that identified a lack of data and exchange standards as one of the top challenges for data use.
Stall explained that publishers and data repositories are working to meet the FAIR principles through actions such as adopting the Center for Open Science’s TOP Guidelines, signing the Coalition for Publishing Data in the Earth and Space Sciences (COPDESS) Statement of Commitment, and endorsing the Joint Declaration of Data Citation Principles. She reported that AGU has recently been awarded a new grant from the Laura and John Arnold Foundation (which is also sponsoring the National Academies committee) to develop best practices and standards to enable FAIR data in the Earth and space sciences.
Community grappling with data repository challenges
Kerstin Lehnert, director of the Interdisciplinary Earth Data Alliance at Columbia University, stressed the importance of creating domain- or discipline-specific data facilities, as they tend to be more trusted by researchers. She said that budgets supporting repositories compete with core science budgets, even as the facilities must continuously update their requirements and technologies. She emphasized that the long-term sustainability of these repositories must be addressed.
Lehnert also emphasized that partnerships among data facilities and their users “are essential to make protocols and policies more effective and the landscape manageable for all stakeholders.” As examples, Lehnert and Stall highlighted COPDESS and the National Science Foundation’s EarthCube as two successful collaborations between data facilities and publishers in the Earth and space sciences.
Representatives from federal science agencies, including the U.S. Geological Survey and the National Institutes of Health, explained that their data management plans already address the need for data repositories that are trusted and comply with open access standards. Agencies developed these plans in accordance with a 2013 White House Office of Science and Technology Policy memo that required agencies with R&D expenditures of more than $100 million to do so.
It is not yet certain which of the points discussed at the meeting will be considered in the committee’s final report. However, it is becoming apparent that many stakeholders are eager to stave off “one-size-fits-all solutions” for moving the research enterprise toward open science. The committee will continue its discussions later this year.