NSF understands what GPO doesn’t seem to

The National Science Foundation (NSF) has been active in technology and Internet efforts for many years, and I think it's fair to consider most NSF people technologists. So it was a pleasant surprise to see libraries and the concept of geographically distributed local digital collections receive several favorable mentions in the new NSF Cyberinfrastructure Vision and Strategic Plan.

Although the plan doesn’t explicitly deal with government information, I think it does have something to say to the depository community and to the Government Printing Office, which appears to favor a centralized model of information dissemination.

First, the plan explains the advantages libraries offered in preserving scientific output in print (emphasis mine):

In print form, the preservation process is handled through a system of libraries and other repositories throughout the country and around the globe. Two features of this print-based system make it robust. First, the diversity of business models deriving support from a variety of sources means that no single entity bears sole responsibility for preservation, and the system is resilient to changes in any particular sector. Second, there is overlap in the collections, and redundancy of content reduces the potential for catastrophic loss of information. – Page 19

Rather than dismiss this robust system as a “legacy collection” of scientific information, the NSF has seen the future and has found it to be geographically and institutionally diverse (emphasis mine):

The national data framework is envisioned to provide an equally robust and diverse system for digital data management and access. It will: promote interoperability between data collections supported and managed by a range of organizations and organization types; provide for appropriate protection and reliable long-term preservation of digital data; deliver computational performance, data reliability and movement through shared tools, technologies and services; and accommodate individual community preferences. The agency will also develop a suite of coherent data policies that emphasize open access and effective organization and management of digital data, while respecting the data needs and requirements within science and engineering domains. – Page 20

Where does NSF expect to find the expertise and willingness to build these diverse digital collections? In part, in libraries! (Emphasis mine):

At the institutional level, colleges and universities are developing approaches to digital data archiving, curation, and analysis. They are sharing best practices to develop digital libraries that collect, preserve, index and share research and education material produced by faculty and other individuals within their organizations. The technological implementations of these systems are often open-source and support interoperability among their adopters. University-based research libraries and research librarians are positioned to make significant contributions in this area, where standard mechanisms for access and maintenance of scientific digital data may be derived from existing library standards developed for print material. These efforts are particularly important to NSF as the agency considers the implications of not just making all data generated with NSF funding broadly accessible, but of also promoting the responsible organization and management of these data such that they are widely usable.

The moral of this story for the depository library community is simple: if the uber-geeks at NSF appreciate the contributions of libraries and the logic of local digital collections, maybe we should too.
