"By an overwhelming margin, technology experts and stakeholders participating in a survey fielded by the Pew Research Center’s Internet & American Life Project and Elon University’s Imagining the Internet Center believe that innovative forms of online cooperation could result in more efficient and responsive for-profit firms, non-profit organizations, and government agencies by the year 2020.... This publication is part of a Pew Research Center series that captures people’s expectations for the future of the Internet, in the process presenting a snapshot of current attitudes."
- The Impact of the Internet on Institutions in the Future, by Janna Anderson, Lee Rainie Pew Internet & American Life Project (Mar 31, 2010)
The State of Linked Data in 2010, by Richard MacManus, Read Write Web (March 31, 2010).
... Linked Data is data that has been marked up using Semantic Web technologies such as RDF (Resource Description Framework) or RDFa (a simpler variation). Minus the acronyms, Linked Data is simply structured data.
However, one of the reasons the Semantic Web hasn't yet been widely adopted, at least commercially, is that it's often difficult or time-consuming to mark up data semantically. RDF in particular has a reputation for being painful to code. With that in mind, the past year has been as much about prompting governments and organizations to put their data up on the Web in whatever form they can....
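Stripped of the acronyms, the underlying model really is simple structured data: every RDF statement is a (subject, predicate, object) triple, and because subjects and predicates are URIs, one data set can link into another. Here is a minimal Python sketch of that triple model; the example.gov URIs are invented for illustration, while the Dublin Core predicate URIs are real vocabulary terms:

```python
# Illustration of the RDF triple model behind Linked Data: every
# statement is a (subject, predicate, object) triple, and URIs make
# the pieces linkable across data sets. The example.gov identifiers
# below are hypothetical.

def to_ntriples(triples):
    """Serialize (subject, predicate, object) triples as N-Triples-style lines."""
    lines = []
    for s, p, o in triples:
        # URIs are wrapped in angle brackets; plain strings become literals.
        obj = f"<{o}>" if o.startswith("http") else f'"{o}"'
        lines.append(f"<{s}> <{p}> {obj} .")
    return "\n".join(lines)

triples = [
    ("http://example.gov/dataset/123", "http://purl.org/dc/terms/title",
     "Annual Budget Summary"),
    ("http://example.gov/dataset/123", "http://purl.org/dc/terms/publisher",
     "http://example.gov/agency/gpo"),
]

print(to_ntriples(triples))
```

The same two statements could equally be expressed as RDFa attributes embedded in an HTML page; the data model is identical, which is why "Linked Data is simply structured data."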
The most high profile usage of Linked Data over the past year has come from two governments: the United States and United Kingdom.
This week, the American Historical Association highlights the University of Nebraska-Lincoln's Government Comics Collection and selects some of its favorites:
- Government Comics Collection of UNL, By Elisabeth Grant, AHA Today (March 30, 2010).
NASA (the National Aeronautics and Space Administration) is archiving its tweets, YouTube videos, photos on Flickr, and Facebook discussions using Archive-it.
- Archiving NASA (Mar 5th, 2010) by NASA Images.
Have you ever wondered what will happen to all of NASA’s tweets, YouTube videos, photos on Flickr, or Facebook discussions? How will you find them years after they’ve been posted? What about the massive amounts of content published on nasa.gov every day? Will it be accessible 50 years from now?
NASA Images has teamed up with Archive-It (also a service of The Internet Archive) to ensure that all of NASA’s online activity will be preserved for future research, curiosity, and enjoyment. We have started by archiving 54 of NASA’s Twitter streams. These 54 streams will be updated once a month, archiving every tweet from every stream. The next step is to archive nasa.gov, including all subdomains, and all of NASA’s social networking activity (YouTube, Facebook, Flickr, Ustream, MySpace). Take a look at the beginning of our conservation efforts in the NASA Images Social Networking collection on Archive-it.
- COLLECTION: NASA Social Networking. Archive-it. 2009 - present (Videos Archived: 13,054 videos).
- Archiving NASA’s social media. by Phil Plait Discovery Magazine Blog (March 21st, 2010).
See also: more NASA materials at Archive-it.
Hat tip: ResourceShelf!
Glenn Greenwald has just published a subtle article about a leaked CIA document and the increasingly aggressive war being waged on Wikileaks, the site that anonymously publishes leaked sensitive governmental, corporate, organizational, and religious documents.
The first part of the article deals with the leaked document, entitled "CIA Red Cell Special Memorandum; Afghanistan: Sustaining West European Support for the NATO-led Mission. Why Counting on Apathy Might Not Be Enough. (PDF)" (and also uploaded to the Internet Archive for the IAdeposit project). This hubristic document announces "Public Apathy Enables Leaders to Ignore Voters" and describes PR strategies for shoring up public support for the continued war in Afghanistan.
But the more interesting and disturbing part of Greenwald's story concerns Wikileaks. Greenwald interviewed Julian Assange, the Australian citizen who is WikiLeaks' Editor. The interview shed light on Wikileaks' work in exposing the secret activities of governments and corporations and also how the US and other governments are targeting Wikileaks as an enemy of the state and trying to destroy the organization -- for more see last week's NY Times article "Pentagon Sees a Threat From Online Muckrakers" and Wikileaks' own editorial on the subject.
...In 2008, the U.S. Army Counterintelligence Center prepared a secret report -- obtained and posted by WikiLeaks -- devoted to this website and detailing, in a section entitled "Is it Free Speech or Illegal Speech?", ways it would seek to destroy the organization. It discusses the possibility that, for some governments, not merely contributing to WikiLeaks, but "even accessing the website itself is a crime," and outlines its proposal for WikiLeaks' destruction. Greenwald also points out the proposed law in Iceland which would provide meaningful whistleblower protection to groups like Wikileaks.
As the Pentagon report put it: "the governments of China, Israel, North Korea, Russia, Vietnam and Zimbabwe" have all sought to block access to or otherwise impede the operations of WikiLeaks, and the U.S. Government now joins that illustrious list of transparency-loving countries in targeting them.
...The need for independent leaks and whistle-blowing exposures is particularly acute now because, at exactly the same time that investigative journalism has collapsed, public and private efforts to manipulate public opinion have proliferated. This is exemplified by the type of public opinion management campaign detailed by the above-referenced CIA Report, the Pentagon's TV propaganda program exposed in 2008, and the ways in which private interests covertly pay and control supposedly "independent political commentators" to participate in our public debates and shape public opinion.
I highly recommend reading Greenwald's article. It's eye-opening on so many levels.
The war on WikiLeaks and why it matters. Glenn Greenwald. Salon.com. March 27, 2010.
From 1981 until 1998, Anne Heanue and the fine folks at the Washington Office of the American Library Association (ALA) published an amazing series called Less Access to Less Information by and about the U.S. Government, a chronology of efforts to restrict and privatize government information.
Readers may remember that the Internet Archive was kind enough to digitize the series from 1981 to 1996 for FGI, but that I had not been able to get my hands on 1997-98 issues. Well now, thanks to Bernadine Abbott Hoduski who sent me the 1997-98 volumes, the complete chronology from 1981 - 98 is now digitized and hosted at the Internet Archive.
Please check out the entire series of Less Access to Less Information by and about the U.S. Government available in the FGI library.
Many thanks again to Ginger Bisharat and Robert Miller at the Internet Archive for their effort. Also thanks to Bernadine Abbott Hoduski for sending me her copies of the series and Emily Sheketoff, Associate Executive Director of ALA and manager of the Washington Office who graciously gave me permission to digitize the series.
Secrecy News says there is a new report on the Foreign Relations of the United States (FRUS) series.
- IG: State Dept Should Produce 12 FRUS Volumes Per Year. by Steven Aftergood, Secrecy News (March 25th, 2010).
- Report of Inspection: The Bureau of Public Affairs, U.S. Department of State Office of Inspector General, February 2010, at pp. 34-38.
“The [State Department Historian's Office] is behind schedule in meeting the statutory FRUS deadline: HO historians only now are compiling the contents of the volumes covering the foreign policy of the Carter administration (1977-1981),” the Inspector General report said. “To achieve compliance with the 30-year deadline, HO will need to accelerate the rate of publication to approximately 12 volumes per year.”
First-ever National Study: Millions of People Rely on Library Computers for Employment, Health, and Education
Submitted by jajacobs on Thu, 2010-03-25 06:43.
First-ever National Study: Millions of People Rely on Library Computers for Employment, Health, and Education, by Samantha Becker, Information School, University of Washington (March 22nd, 2010).
PORTLAND, Ore.—Nearly one-third of Americans age 14 or older – roughly 77 million people – used a public library computer or wireless network to access the Internet in the past year, according to a national report released today. In 2009, as the nation struggled through a recession, people relied on library technology to find work, apply for college, secure government benefits, learn about critical medical treatments, and connect with their communities.
The report, Opportunity for All: How the American Public Benefits from Internet Access at U.S. Libraries, is based on the first, large-scale study of who uses public computers and Internet access in public libraries, the ways library patrons use this free technology service, why they use it, and how it affects their lives. It was conducted by the University of Washington Information School and funded by the Bill & Melinda Gates Foundation and the Institute of Museum and Library Services.
Becker, Samantha, Michael D. Crandall, Karen E. Fisher, Bo Kinney, Carol Landry, and Anita Rocha. (2010). Opportunity for All: How the American Public Benefits from Internet Access at U.S. Libraries (PDF, 212 pages). (IMLS-2010-RES-01). Institute of Museum and Library Services. Washington, D.C.
I am not an expert on Digital Object Identifiers (DOI) or Handles or other methods of creating permanent, persistent links to information on the web, so I pose this as a question. Could DOIs help solve three problems that, if solved, would provide better preservation, better access, and a better user experience?
The three challenges are:
1. The need for reliable, permanent, persistent links.
2. The need to provide a simple user interface to depository collections.
3. The need to guarantee authenticity of government information.
Here is why I think the answer is Yes.
Problem: Providing reliable, permanent, persistent links. Currently, GPO uses PURLs (Persistent Uniform Resource Locators) for creating permanent links. PURLs provide "persistent" links so that, when a page moves and its URL changes, that change need only be recorded once -- in the PURL database -- and all the hundreds or thousands of links to the PURL resolve to the new address automatically without being changed themselves. While this is an efficient way to deal with the dynamic nature of web addresses, and while this system works, it is fragile. We had a graphic demonstration of that fragility last August when the GPO PURL server crashed. When that happened, no one anywhere in the world who relied on PURL links to the 115,000 PURLs pointing to government information could reach that information using those links for more than two weeks. This was not the fault of GPO (although restoration time could have been reduced with better disaster planning). Rather, the very nature of PURLs makes them fragile in this way and vulnerable to the crash of a single server.
Solution: Persistence is a function of organizations, not of technology. DOIs address the fragility problem by building a social structure that guarantees persistence. As the DOI organization says, "Persistence is a function of organizations, not of technology; a persistent identifier system requires a persistent organization and defined processes. The International DOI Foundation (IDF) provides a federation of Registration Agencies (RA). Dependency on any one RA is removed." In other words, if one server crashes, others are available immediately. Rather than relying on a single organization (GPO) and a single server at that organization, DOIs rely on multiple Registration Agencies and multiple servers. DOIs are reliable because they use redundancy and have no single point of failure (Wikipedia).
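The redundancy argument can be sketched in a few lines of Python. This is not real DOI-system code -- the resolver names and the identifier below are invented -- but it shows why a client backed by a federation of resolvers survives the kind of outage that took down every PURL link last August:

```python
# Sketch (hypothetical, not real DOI-system code) of federated
# resolution: the client tries each resolver in turn, so one crashed
# server does not break the link.

def resolve(identifier, resolvers, lookup):
    """Return the target URL from the first resolver that is up and
    knows the identifier; raise LookupError if none can answer."""
    for resolver in resolvers:
        try:
            return lookup(resolver, identifier)
        except ConnectionError:  # this resolver is down -- try the next one
            continue
    raise LookupError(f"no resolver could answer for {identifier}")

# Simulated network: the first resolver has crashed (as the lone GPO
# PURL server did), but a second resolver holds the same record.
records = {"10.9999/example-doc": "http://example.gov/doc.pdf"}  # invented DOI

def lookup(resolver, identifier):
    if resolver == "ra1.example.org":
        raise ConnectionError("server down")
    return records[identifier]

url = resolve("10.9999/example-doc",
              ["ra1.example.org", "ra2.example.org"], lookup)
print(url)  # the link still resolves despite the first resolver's outage
```

With a single PURL server the `resolvers` list has one entry, and the first `ConnectionError` means every link in the world that depends on it goes dark.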
Problem: Providing a simple user interface. Imagine with me for a moment a depository system that deposits digital documents in FDLP libraries. Once such a system is in place, we will have the same document in multiple locations -- perhaps one copy in GPO's Federal Digital System (FDsys), one copy in each of a dozen or more FDLP libraries, perhaps an "original" copy at house.gov or senate.gov, and so forth. What is the user to do? Will libraries show dozens of links with an explanation after each as to what it is and hope users will have the patience to read the explanations, make an informed decision, and, if that particular link is down, go back and repeat the process? This sounds like a lousy user experience to me.
Solution: Multiple redirections. DOIs provide a way to resolve a single DOI to multiple URLs (Resolution of Multiple URLs). This would mean that multiple copies of digital documents could be stored at many separate FDLP libraries and all could use a single, clickable link (a DOI) that would get users a copy of that document based on criteria the library defines. For example, one library might have the DOI point to the original first and the local library copy second; another library might point to the "network-closest" copy first and then other, more distant copies; and so forth. DOIs do this by storing metadata with the DOI. Rather than storing only the current URL of a registered item, DOIs can record a list of locations with hints for how the resolving client should select a location, including an ordered set of selection methods.
Here is an illustration of how it works:
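A hypothetical Python sketch of this kind of multiple resolution may also help. The DOI, the locations, and the "kind" labels below are all invented for illustration; the point is that one identifier carries a list of copies plus a library-defined policy for ordering them:

```python
# Hypothetical sketch of DOI-style multiple resolution: one identifier
# maps to several stored copies, and each library supplies its own
# ordered preference for which copy its users should get first.
# All identifiers and URLs below are invented.

doi_metadata = {
    "10.9999/crs-r40599": [  # hypothetical DOI
        {"url": "http://fdsys.example.gov/r40599",     "kind": "original"},
        {"url": "http://library-a.example.edu/r40599", "kind": "local"},
        {"url": "http://library-b.example.edu/r40599", "kind": "remote"},
    ],
}

def pick_copy(doi, preference):
    """Return the registered copies of `doi`, best match first,
    according to a library's ordered `kind` preference."""
    copies = doi_metadata[doi]
    rank = {kind: i for i, kind in enumerate(preference)}
    # Unlisted kinds sort last; Python's sort is stable, so registry
    # order breaks ties.
    return sorted(copies, key=lambda c: rank.get(c["kind"], len(rank)))

# One library prefers the original, falling back to its local copy...
print(pick_copy("10.9999/crs-r40599", ["original", "local"])[0]["url"])
# ...another sends users to its local copy first.
print(pick_copy("10.9999/crs-r40599", ["local", "original"])[0]["url"])
```

Both libraries publish the same clickable DOI; only the resolution policy differs, which is exactly what makes a single link workable across many deposited copies.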
This solution would have the added benefit of enabling and facilitating a true digital depository system in which digital information is deposited into FDLP libraries. FGI is a strong advocate of a depository system that does this for several reasons that we have described repeatedly here and in our writings and presentations. In brief, we believe that this would make it possible for individual FDLP libraries to build their own local digital collections focused on the needs of their own user communities; it would aid preservation by ensuring that multiple copies exist under different technical, financial, and administrative structures; and it would create a better user experience by providing a way to integrate digital FDLP/Title-44 documents with non-Title-44 federal documents, documents from state and local governments, and other non-government information. DOIs would therefore facilitate preservation as well as access.
Problem: Guarantee Authenticity. How does a user know that the document they just retrieved is "authentic," that it has not been altered, that it really is what it purports to be? Many people hope for a technological solution (e.g., PKI, time stamps, encryption, digital signatures, watermarks). We at FGI believe that these are techniques that people use and that the authenticity comes, not from the technique, but from users' trust in the people who set up the techniques.
No one explained this better than Abby Smith (Digital Authenticity in Perspective in "Authenticity in a Digital Environment,” Council on Library and Information Resources, Publication 92. May 2000). She noted that, when technologists were asked about how to establish the authenticity of a digital object, they were skeptical of technological "solutions" and said that "there is no technological solution that does not itself involve the transfer of trust to a third party."
Solution: Trust is a social phenomenon, not a technical one. So, imagine how this might work. Imagine a document that is in FDsys, and in the digital collections of several FDLP libraries, and also at the New York Times, and at any number of other places on the web. There might be a dozen URLs for that one document. But, if GPO assigned a single DOI to it and made sure it pointed to FDsys and to "Official Depository Copies" at FDLs, that one DOI would, by definition, point to "authentic" copies -- the original and those officially deposited in Title-44-authorized Federal Depository Libraries. The "prefix" part of a DOI refers to the registering agency (in this case GPO) and would further help "brand" the DOI as authentic. Users wanting "authentic" government information would look for DOIs bearing the GPO prefix -- and they would find what they wanted with a single click, no matter where the particular copy they get is stored. (In addition, the DOI metadata can include authenticity information.)
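The prefix check itself is trivial, which is part of the appeal: the social arrangement (GPO registering only authentic copies under its prefix) does the heavy lifting, and the client-side test is one line. A minimal sketch, assuming a hypothetical GPO prefix (real prefixes are assigned by Registration Agencies; 10.9999 is invented here):

```python
# Minimal sketch of "branding" by DOI prefix: the portion before the
# slash names the registrant, so a client can flag DOIs registered by
# GPO. The prefix value is hypothetical.

GPO_PREFIX = "10.9999"  # invented prefix standing in for GPO's

def is_gpo_registered(doi):
    """True if the DOI has a suffix and its prefix matches the
    (hypothetical) GPO prefix."""
    prefix, _, suffix = doi.partition("/")
    return bool(suffix) and prefix == GPO_PREFIX

print(is_gpo_registered("10.9999/crs-r40599"))  # GPO-registered
print(is_gpo_registered("10.1000/other-item"))  # another registrant
```

The technical check only matters because of the social guarantee behind it -- users trust the prefix because they trust GPO's registration process, exactly as Abby Smith's point about transferring trust to a third party suggests.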
Precedents. GPO would not be alone in using DOIs. Who uses DOIs? ICPSR, OECD, the European Communities' EU publications office, CrossRef, and many others.
Barriers. The main barrier I can see to adopting DOIs is cost. I assume that it will surely cost more than implementing PURLs. But the two costs cannot be compared directly because the costs buy different things. Implementing PURLs gets us a fragile redirection system. Implementing DOIs gets us a redirection system of persistent identifiers, the ability to have multiple redirects to multiple copies, and a new way of thinking about authenticity.
I welcome comments and responses to my question and particularly hope that those with more knowledge in this area will fill in the gaps I have left.
A recent survey of federal CIOs entitled "Transparency and Transformation Through Technology" (PDF) found that Federal agency chief information officers want to "concentrate on integrating information technology systems with business processes, not necessarily on transparency and performance management initiatives." Number one on their list of greatest-value projects was "integrating systems and processes." In fifth and last place was "transparency and performance management initiatives."
The survey asked more than 40 federal CIOs about which initiatives will provide greatest value and what they consider the greatest barriers to increased effectiveness. For more, see the Fierce Government IT news briefing:
Federal CIOs aren't sure Obama administration IT goals add value. By David Perera. March 24, 2010
[HT Jason Crawford!]
An interesting case study:
- Meikiu Lo and Leah M. Thomas. Creating an Institutional Repository for State Government Digital Publications. The Code4Lib Journal (22 Mar. 2010).
In 2008, the Library of Virginia (LVA) selected the digital asset management system DigiTool to host a centralized collection of digital state government publications. The Virginia state digital repository targets three primary user groups: state agencies, depository libraries and the general public. DigiTool’s ability to create depositor profiles for individual agencies to submit their publications, its integration with the Aleph ILS, and product support by ExLibris were primary factors in its selection. As a smaller institution, however, LVA lacked the internal resources to take full advantage of DigiTool’s full set of features. The process of cataloging a heterogeneous collection of state documents also proved to be a challenge within DigiTool. This article takes a retrospective look at what worked, what did not, and what could have been done to improve the experience.
Library Hi Tech, Volume 28, issue 2 (2010) is a special issue on "Technology and digital preservation." Preprints of articles are now available, though subscription may be required to access some. A few that you may find interesting:
- Economics, sustainability, and the cooperative model in digital preservation, by Mr. Tyler O. Walters, Dr. Katherine Skinner. "The authors provide an examination of the emerging field of digital preservation and its economics. They consider in detail the cooperative model and the path it provides toward sustainability as well as how it fosters participation by cultural memory organizations and their administrators, who are concerned about what digital preservation will ultimately cost and who will pay."
- "Land of the lost": a discussion of what can be preserved through digital preservation, by Mr. David Pearson, Mr. Nicholas del Pozo, Mr. Andrew Stawowczyk Long. "...proposes the concept of preservation intent: a clear articulation of a commitment to preserve an object, the specific elements of that object that should be preserved, and a clear time line for the duration of preservation. It investigates these concepts through simple and practical examples."
- Keeping It Simple: The Alabama Digital Preservation Network (ADPNet), by Mr. Aaron Trehub, Mr. Thomas C. Wilson. "The purpose of this paper is to present a brief overview of the current state of Distributed Digital Preservation (DDP) networks in North America and to provide a detailed technical, administrative, and financial description of a working, self-supporting DDP network: the Alabama Digital Preservation Network (ADPNet). The authors view ADPNet in a comparative perspective with other Private LOCKSS Networks (PLNs) and argue that the Alabama model represents a promising approach to DDP for other states and consortia."
There is much more. A very rich and informative issue.
The Iraq Papers, Edited by John Ehrenberg, J. Patrice McSherry, Jose Ramon Sanchez and Caroleen Marji Sayej. Oxford University Press, 2009.
No foreign policy decision in recent history has had greater repercussions than President George W. Bush's decision to invade and occupy Iraq. It launched a new doctrine of preemptive war, mired the American military in an intractable armed conflict, disrupted world petroleum supplies, cost the United States hundreds of billions of dollars, and damaged or ended the lives of hundreds of thousands of Americans and Iraqis. Its impact on international politics and America's standing in the world remains incalculable.
The Iraq Papers offers a compelling documentary narrative and interpretation of this momentous conflict. With keen editing and incisive commentary, the book weaves together original documents that range from presidential addresses to redacted memos, carrying us from the ideology behind the invasion to negotiations for withdrawal. These papers trace the rise of the neoconservatives and reveal the role of strategic thinking about oil supplies. In moving to the planning for the war itself, the authors not only provide Congressional resolutions and speeches by President Bush, but internal security papers, Pentagon planning documents, the report of the Future of Iraq Project, and eloquent opposition statements by Senator Robert Byrd, other world governments, the Non-Aligned Movement, and the World Council of Churches. This collection addresses every aspect of the conflict, from the military's evolving counterinsurgency strategy to declarations by Iraqi resisters and political figures -- from Coalition Provisional Authority orders to Donald Rumsfeld's dismissal of the insurgents as "dead-enders" and Iraqi discussions of state- and nation-building under the shadow of occupation. The economics of petroleum, the legal and ethical questions surrounding terrorism and torture, international agreements, the theory of the "unitary presidency," and the Bush administration's use of presidential signing statements all receive in-depth coverage.
The Iraq War has reshaped the domestic and international landscape. The Iraq Papers offers the authoritative one-volume source for understanding the conflict and its many repercussions.
Thanks and a tip of the FGI hat to Steven Aftergood who says that "Somewhat heavy-handedly, [the editors] offer their own interpretation of events involving the decisive influence of neo-conservatives, the unitary executive, and a U.S. drive to global hegemony, among other factors. Alternative explanations are not considered here."
Announcement of Workshop:
Providing Social Science Data Services: Strategies for Design and Operation
August 9-13, 2010
Ann Arbor Michigan
Chuck Humphrey, Head of the Data Library, University of Alberta
Jim Jacobs, Data Services Librarian Emeritus, University of California San Diego
This five-day workshop is being offered for individuals who manage or provide local support services for ICPSR and other numeric data for quantitative research.
Providing access to data has taken on greater prominence over the past decade with the emergence of several significant developments, including e-Science infrastructure funding, the open data movement, national and institutional digital preservation strategies & services, data enclaves for confidential data, lifecycle data management planning, and data mash-up technologies on the Internet. Given these major environmental changes, how does one plan and design appropriate levels of data service in her or his local institution?
This workshop is structured around a five-stage data lifecycle model that focuses on data production, data dissemination, data repositories, data discovery and data repurposing. A day is dedicated to each stage in this model during which discussions address issues for local data services and computer exercises demonstrate service activities. In this context, fundamental data topics are covered, including understanding the data reference interview, working with variables, interpreting data documentation, coping with various dissemination formats, accessing different online services (e.g., SDA and Nesstar), searching for social science data, subsetting data using Web-based tools, selecting and downloading ICPSR data, and options for local data delivery. Throughout the workshop, an emphasis will be placed on social science concepts and terminology, as well as on practical solutions to service delivery.
Who Should Attend: Anyone who is new to providing services for numeric social science data or is seeking to revitalize an existing service. This is not a course in statistics and attendees are not expected to know how to analyze data.
The workshop will remain open only until the Summer Program office has received 20 paid applications.
If you have questions about registration, fees, travel, housing, or other courses at the ICPSR Summer program, please get in touch with ICPSR directly:
If you have any questions about the workshop content, please feel free to send email to Chuck or Jim:
Chuck: humphrey at datalib.library.ualberta.ca
Jim: jajacobs at ucsd.edu
Dates: August 9-13, 2010
Location: University of Michigan, Ann Arbor MI.
Fees (Participants from ICPSR member institutions): $1,500
Fees (Participants from institutions that are not members of ICPSR): $3,000
List of ICPSR member institutions and Official Representatives:
Information about transportation and housing:
This workshop is part of the ICPSR Summer Program in Quantitative Methods of Social Research
James A. Jacobs
jajacobs at ucsd.edu
ProPublica shows all the changes in the Health Care bill side by side. A very useful tool!
- Side by Side: Health Care Bills, (Senate-Passed Version (Dec. 24, 2009) With Reconciliation Changes (March 18, 2010)). by Olga Pierce and Jeff Larson, ProPublica
"we've created a side-by-side comparison of the full versions of the Senate healthcare bill versus the bill that will likely go before the House for a vote on Sunday. What you've seen elsewhere -- the text put out by the House Rules Committee -- is just a 150-page list of amendments to the Senate bill ("strike paragraph 4", "insert this new sentence in paragraph B:…"). What we've created -- the full proposed final bill, with highlights of the changes -- allows you to easily compare the House's Reconciliation proposal to the earlier Senate bill."
The Supreme Court has updated its much-maligned, old-fashioned website (see the old website at the Wayback Machine). It has also changed its URL from supremecourtus.gov to supremecourt.gov.
- Changes for Court’s website, Lyle Denniston SCOTUS Blog (March 18th, 2010).
The Supreme Court has now assumed management of its own website, retrieving it from the Government Printing Office.
- press release, Supreme Court, Office of Public Information (March 18th, 2010).
The Sunlight Foundation, which had suggested changes to the site last year, has some useful comments and suggestions for further change and links to more information:
- Supreme Court unveils new website: how does it look?, By Daniel Schuman, Sunlight Foundation Blog (03/18/10).
Citizens across the country have received the Census questionnaire this week.
PLEASE FILL IT OUT IMMEDIATELY.
Remember, the Census ensures that your community gets the federal funding it needs. Funding for things like education, transportation, hospitals, and more!
Your information is kept completely confidential for 72 years. Then it becomes available to the public -- which means eventually your great-great-grandkids can research where you were in 2010.
So fill it out...send it in...because we can't move forward until you send it back!
The Sunlight Foundation announced today a new bill introduced by Congressman Steve Israel (NY-2) called the Public Online Information Act (POIA) (read the bill (PDF)). POIA will require that all "public" executive branch documents be permanently available on the Internet at no cost. POIA also creates a:
"special federal advisory committee to coordinate the development of Internet disclosure policies. These policies promote information best practices, including data interoperability standards, and will keep the government up-to-date with new technology. The advisory committee’s 19 members – six appointed by each branch of government, plus one by GSA – are drawn from the public and private sectors and serve as watchdogs, synthesizing the needs of agencies and the public and making recommendations on updating federal law."
While I wholeheartedly support the spirit of POIA -- free permanent internet access to executive branch documents! -- and will definitely be contacting my representative to support its passage, I have two concerns that I hope will be discussed by the Sunlight community, the soon-to-be federal advisory committee, libraries and the public:
1) preservation: There was an article in today's NY Times -- "Fending Off Digital Decay, Bit by Bit" -- that highlights the many issues surrounding digital preservation. Just putting something on the Web does not mean that it will be preserved. The GPO has been working on their Federal Digital System (FDsys) since 2004 (and really since 1994 when they started GPO Access) to deal with the inherent digital issues. Many researchers, librarians, academics, computer programmers, etc. have been working on these issues pretty much since the 1960s. And the issues are still here today.
So I'd like to see as part of this bill an acknowledgement that online information is expensive to preserve AND that there will be continued funding for research and sustainability of digital archives through the National Digital Information Infrastructure & Preservation Program (NDIIPP). Readers are encouraged to explore the issues here and here.
2) privatization of govt information: The following from the Sunlight announcement caught my eye and concerned me:
Freeing government information from its paper silos provides the private sector with raw material to develop new products and services and gives the public what they need to participate in government as active and informed citizens.
Federal government information is in the public domain. That's a good thing. However, there's a fundamental issue at stake here. One can't have "permanent free public access" to government information where the private sector is involved. The private sector has been involved in giving access to government information for a long time (see LexisNexis, Thomson West, Readex etc). They do it well but they certainly don't do it for free. Libraries and other organizations have paid many millions of dollars to license access to govt information for the communities they serve. Here's more background and context on privatization. For all intents and purposes, these private sector companies take public domain information and privatize it. Any digital govt information accessible on the internet should already be findable, usable and accessible in bulk at minimum.
But there needs to be more. What I'd like to see in this bill and in the discussion after it passes (devil's in the details, right?!) is not only a requirement that all govt information be online permanently and for free, but also a viral, GNU General Public License-like provision whereby anything IN the public domain (i.e., govt information) has to STAY IN the public domain. There are plenty of folks (I'm looking at you, Sunlight, Govtrack.us, OpenCongress, OpenCRS, etc.) excited about making govt information more available, more usable, and more shareable, and this would support their public service.
C-SPAN has posted their archives online. That's 23 years' worth -- 160,000 hours -- online (almost all of their content). This is extremely cool. Get ready to waste a chunk of time today going through their archive. It should be noted that while all their programming is available, popular programs like Book TV are not embeddable (although you CAN send the link to Facebook, Twitter, etc.). Go ahead and browse the committee list for a little vicarious legislating :-)
The C-SPAN Archives records, indexes, and archives all C-SPAN programming for historical, educational, research, and archival uses. Every C-SPAN program aired since 1987, now totaling over 157,000 hours, is contained in the C-SPAN Archives and immediately accessible through the database and electronic archival systems developed and maintained by the C-SPAN Archives.
[HT to Paul Blumenthal (@PaulBlu) at Sunlight Foundation!]
2009 Fall DLC Meeting: "Demystifying Digital Deposit: What It Is and What It Could Do for the Future of the FDLP"
Submitted by rtroyhorton on Tue, 2010-03-16 08:04.
At the Fall 2009 Depository Library Council (DLC) meeting in Arlington, VA, James A. Jacobs and I (Rebecca Blakeley) introduced attendees to the concept of "digital deposit," which maps out the pieces of the FDLP cloud and what it could do for the future of the FDLP. Our slides and notes are available for you to view and download online.
2005 Nevada Library Association Annual Conference: "Whose government information? Our government information."
Submitted by jrjacobs on Mon, 2010-03-15 10:09.
At iConference 2010 at the University of Illinois at Urbana-Champaign, I organized and participated in the panel "Future of Government Information" with Tom Bruce (Cornell Legal Information Institute), Daniel Schuman (Sunlight Foundation), and Cindy Etkin (Government Printing Office (GPO)). My slides and notes are online.
Below is a list of places and conferences to which FGI volunteers have been invited to speak about the future of government information.
Calling all 21st century librarians: the fine folks at Citability and the League of Technical Voters Project are organizing a weekend code-a-thon in Washington DC April 9th - 11th. The goal is to create open source tools aimed at improving government accessibility and accountability. But you don't have to be a coder to participate. They're also looking for librarians! Now's your chance to put your govt information skills toward an amazing project.
If you live in the Washington DC area, please sign up for the DC Code-a-thon today. Join lots of smart people working hard and having fun for the great cause of govt transparency!
[UPDATE: Scroll down for list of library happenings for Sunshine Week]
Spring has sprung with a vengeance here in SF. And that could only mean one thing: Sunshine Week!! Yes it's time once again to feel the warm FOIA on your cheek, to discuss and raise awareness of the importance of free and open government information, transparency and the Freedom of Information Act. Be on the lookout for editorials in your local newspaper (like this one in the Cleveland Plain Dealer), discuss FOIA with your friends and family (you'll be glad you did :-)) and highlight it in your libraries -- perhaps by having a public showing of the OpenTheGovernment Webcast!
OpenTheGovernment.org is hosting a Sunshine Week Webcast, "Building Transparency," from 12-2 PM EST on Friday, March 19. The Webcast will include a host of great speakers: Norm Eisen, Special Counsel to the President for Ethics and Government Reform; Jim Harper, Director of Information Policy Studies at the Cato Institute; John Wonderlich, Policy Director at the Sunlight Foundation; Kevin Goldberg, counsel to the American Society of News Editors (ASNE); Miriam Nisbet, Director of the new Office of Government Information Services (OGIS); Melanie Sloan, Executive Director of Citizens for Responsibility and Ethics in Washington (CREW); Melanie Pustay, Director of the Department of Justice (DOJ) Office of Information Policy (OIP); Eric Gundersen, President and co-founder of Development Seed; and Sean Moulton, Director of Federal Information Policy at OMB Watch. It should be a great discussion, so we hope you can tune in.
What libraries and others are doing for Sunshine Week:
- Northern CA Association of Law Libraries (NOCALL), in association with the Special Library Association Sierra Nevada Chapter, is sponsoring two Sunshine Week events: one in Sacramento and one in San Francisco. Both have interesting slates of speakers and require registration for a small fee ($20 for the Sacramento event, $15 for the SF event). In addition, the SF event immediately precedes the NOCALL Spring Institute on information piracy, "Piracy on the Barbary Coast," which NOCALL and SLA members can attend at the NOCALL member rate, and, later in the evening, a celebration of NOCALL's 30th anniversary.
- Freedom of Information Day at the New York Public Library. Tuesday, March 16, 2010, 10:30 AM - noon. Conference Room 18 on the lower level of the New York Public Library (188 Madison Ave. @ 34th St.).
This year's guest speaker is Heather Joseph, Executive Director of the Scholarly Publishing and Academic Resources Coalition (SPARC), an international alliance of academic and research libraries working to create a more open system of scholarly communications. FOIA Day has been held at NYPL annually since 1993.
- California State University San Bernardino's Pfau Library has partnered with the San Bernardino League of Women Voters to be a site for the OpenTheGovernment.org webinar on government transparency. This is the second year that Pfau Library has participated. You can see video of last year's event.
- The web site www.TalkStandards.com will focus on open government during its monthly online forum. The forum will take place on Thursday, March 18th, from 8-12 Pacific / 11-3 EST / 4-8pm GMT.
TalkStandards is an active online community where ICT developers, researchers, policymakers, and other interested parties can share ideas and collaborate on the global standards system. Each month a timely topic is chosen (last month's topic, for example, was eHealth).