According to Library Journal, Google announced last week that it had formed “partnerships” with four states (Arizona, California, Utah, and Virginia) to offer indexing and search capabilities for public information in state government databases. Google’s Public Sector program seeks to make government information, much of it buried in the deep web of databases, more accessible through its Sitemaps protocol. A sitemap is “an XML file that lists the URLs for a site. It allows webmasters to include additional information about each URL: when it was last updated, how often it changes, and how important it is in relation to other URLs in the site. This allows search engines to crawl the site more intelligently” (Wikipedia). I’m all for making government information at all levels more findable by search engines, and Sitemaps is an interesting way for webmasters to do that: sort of a MARC record for crawlers.
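To make the protocol concrete, here is a minimal sketch of generating a sitemap file with Python’s standard library. The namespace is the one defined by the Sitemaps protocol; the record URL and date are hypothetical stand-ins for what a state agency might export from its database:

```python
import xml.etree.ElementTree as ET

# Namespace defined by the Sitemaps protocol (sitemaps.org)
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    """Build a sitemap XML string from (loc, lastmod, changefreq, priority) tuples."""
    ET.register_namespace("", SITEMAP_NS)  # serialize without a namespace prefix
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for loc, lastmod, changefreq, priority in entries:
        url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = loc
        ET.SubElement(url, f"{{{SITEMAP_NS}}}lastmod").text = lastmod
        ET.SubElement(url, f"{{{SITEMAP_NS}}}changefreq").text = changefreq
        ET.SubElement(url, f"{{{SITEMAP_NS}}}priority").text = priority
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical record from a state government database
sitemap = build_sitemap([
    ("https://example.gov/records/1234", "2006-05-01", "monthly", "0.8"),
])
print(sitemap)
```

Each `<url>` entry carries exactly the metadata the Wikipedia definition mentions: the URL itself, when it last changed, how often it changes, and its relative priority. A crawler that fetches this one file knows about every database record without having to discover links on its own.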
Another way to do that is for libraries to write and blog about their collections, their documents, the questions they answer, and the resources they use to answer those questions. Libraries can also use web services like del.icio.us to collect and describe the websites they use on a daily basis (see our tag cloud for an example). (On a side note, has anyone seen 50 Matches, a search engine that crawls only websites that were bookmarked or voted for by people on sites like del.icio.us, Digg, and Reddit?)
These practices would, in effect, release the information that libraries traditionally hold in closed systems and databases, make our collections (both digital AND physical!) more findable, and vet the Web for our users. Got other ideas for “info-catharsis”? Let us know in the comments.
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.