FOIAmachine will help journalists, researchers and the public submit FOIA requests, track their progress through the Federal government bureaucracy, and post documents from successful FOIA requests online for public access. While they've met their original goal, they're now pursuing stretch goals in order to build out additional features and host open records training in 5 states. I'm a backer and you should be too. You've got 3 days to help the Center for Investigative Reporting (CIR) meet their goal and help make FOIAmachine a reality.
Video from NPR Digital Services on the value of government information resources in journalism. Provides examples of data-driven stories, and discusses where to find data and how to use it effectively without making "rookie mistakes." Also contains information on using the Freedom of Information Act (FOIA) to get data that is not currently accessible.
If you watch till the end, you'll see a mention of the usefulness of the State Agency Databases project at about 52:00.
There are a lot of parallels between journalism and librarianship and between newspapers and libraries in the digital age. In a recent article, one journalist has suggestions for journalists that, I believe, have analogies for librarians. One useful idea: the need for mentors (with lots of experience) for the new generation of librarians.
- Why we need to separate our stories from our storytelling tools, By David Skok, Nieman Journalism Lab (Sept 28, 2011).
In the digital world, the tools we use to tell the world's stories -- Twitter, Google, Facebook -- control us as much as we control them. I am a digital journalist, and I’m enthusiastic about what our new platforms can provide us in terms of telling stories. But I also wonder whether we’re letting our tools define, rather than serve, the stories we tell.
...Twitter, Google, and Facebook -- to take the most prominent examples -- are wonderful tools that open up a whole new universe of communication, interaction, and reporting. But that's all that they are: tools. And they are tools, of course, that are provided by profit-driven companies whose interest lies as much in their own benefit as our own.
...And the onus is on digital journalists to welcome veteran reporters into the future’s fold -- to help them navigate the new tools that will inform, if not define, the shape journalism takes going forward.
But the onus is also on digital journalists to learn from the veterans -- to learn reporting methods and narrative techniques and skills that have nothing to do with Google or Facebook or Twitter, and everything to do with journalism as it's been practiced throughout its history. The veterans may not be able to show you how to create Fusion tables, but I can promise that, from them, you'll learn something new that will help your reporting more than the latest tools ever could.
As a companion piece on a different, but related, subject I like this article from the new blog at the Chronicle:
- Curate for What Ails Ya, By Ben Yagoda, Chronicle of Higher Education Lingua Franca blog (September 28, 2011).
[The web] has developed in such a way that raw data are sorted and organized not by human hands but by algorithms (number of page views, number of thumbs-up, Google's secret sauce, Wikipedia's universal access and veto power) that are certainly democratic and often useful, but just as often bring in too much noise and too much funk.
Curating the word and curating the phenomenon suggest a welcome recognition that some situations demand expert taste and judgment.
Beyond Books: News, Literacy, Democracy and America's Libraries (BiblioNews) is a one-and-a-half-day convening April 6-7, 2011 at MIT in Cambridge, Mass., for journalists, librarians and citizens.
This looks to be a really interesting conference! Please let us know if you are going and can blog about the event here. (Send mail to freegovinfo at gmail dot com )
For three centuries, in American towns large and small, two institutions have uniquely marked a commitment to participatory democracy, learning and open inquiry -- our libraries and our free press. Today, as their tools change, their common missions of civic engagement and information transparency converge. Economic and technology changes suggest an opportunity for collaboration among these two historic community information centers -- one largely public, one largely private. How?
I’ve been heartened by a recent string of long-form journalism that’s been making a buzz, provoking change, and bringing attention and insight to important issues. As it happens, these pieces often draw heavily upon government information. Examples include the Washington Post’s series Top Secret America, outlining the growth of security and intelligence in a post-9/11 America; and The Runaway General, Rolling Stone’s profile of General Stanley McChrystal, which led to his firing for disparaging comments he and his aides made about the administration. In both cases, government information illuminates the exploration of current, pressing issues in the news.
I’ve often been frustrated with the standard editorial practice of mentioning, but not completely citing, the particular documents referred to in newspaper articles. It masks the ubiquity of government information in our daily lives, and sets up a barrier to readers who might be interested in examining the original documents themselves (and it can make it challenging for a librarian to track a document down when a patron seeks assistance). For example, in The Transformer, Foreign Policy’s recent story on Secretary of Defense Robert Gates, which fostered speculation that he might retire before the end of Obama’s first term, author Fred Kaplan refers to a hearing before the Senate Armed Services Committee early in Obama's presidency in which Gates testified. This would be findable enough, but would require more tenacity than a casual reader might muster.
The Washington Post’s recent piece, How the Minerals Management Service’s partnership with industry led to failure, is a great example of journalists harnessing the possibilities of the online environment to enhance the reading experience and access to related documents. In this long piece on the too-cozy relationship between regulators and industry, the journalists not only tell readers exactly which documents they used in their reporting, they link to highlighted, annotated full-text of primary sources used in the story, such as a memo from the Inspector General to the Secretary of the Interior on investigations of MMS employees. This supplements the story by giving the reader routes for further exploration, as well as a genealogy of the story that gives more transparency to the journalism itself.
Creating an annotated map, pointing back to the primary documents used to inform a journalist’s narrative, would be a great exercise for students studying government information, journalism, librarianship, indeed citizenship, to raise awareness of the life cycle of government information and what can happen when it is unleashed in the public square.
DocumentCloud is a new service being developed with startup funding from the James L. Knight Foundation. It sounds like an excellent service. It will be "software, a Web site, and a set of open standards that will make original source documents easy to find, share, read and collaborate on, anywhere on the Web."
I cannot help but wonder why libraries are not at the forefront of projects like this.
Started by reporters at the New York Times and ProPublica, this service will give individuals and organizations involved in original reporting mechanisms for sharing the documents they obtain and discover, and for making those documents available to others for new reporting and new uses.
Over two dozen organizations are working on the development of DocumentCloud, including traditional publications and news organizations such as The Atlantic, Chicago Tribune, Forbes, The Seattle Times, Thomson Reuters, Washington Post, and WNYC Radio, as well as organizations that collect and publish documents, such as The National Security Archive, ACLU National Security Project, OpenCRS, and the Sunlight Foundation.
Users will be able to search for documents by date, topic, person, location, etc., and will be able to do "document dives," collaboratively examining large sets of documents. Think of it as a card catalog for primary source documents. As the project describes itself, DocumentCloud is not meant to be a general document hosting service like Scribd, Docstoc or Google Docs; its goal is to build a service that makes source documents easier to find and share regardless of where they are hosted. It is a complement to those services, not a competitor, and it aims to make documents even easier to find on search engines. DocumentCloud will have information about documents and the relations between them, for example which locations, people, or organizations a group of documents have in common. Conceived by journalists working at ProPublica and The New York Times, DocumentCloud will be managed as an independent nonprofit.
Their FAQ notes: "Will there be an API? Hell yes."
See also: Coming soon: Data mining made easier, By Alex Byers, Nieman Watchdog (July 11, 2009).
Project Censored, a media research project from Sonoma State University in California, puts out an annual list of "news that didn't make the news." They've just released their 2010 edition (see below). I hope lots of people will go out and get a copy for themselves and their local libraries, because this is what journalism is all about. It is the flip side of government transparency: more available government information makes for better and more thorough journalism.
- 1. US Congress Sells Out to Wall Street
- 2. US Schools are More Segregated Today than in the 1950s
- 3. Toxic Waste Behind Somali Pirates
- 4. Nuclear Waste Pools in North Carolina
- 5. Europe Blocks US Toxic Products
- 6. Lobbyists Buy Congress
- 7. Obama’s Military Appointments Have Corrupt Past
- 8. Bailed out Banks and America’s Wealthiest Cheat IRS Out of Billions
- 9. US Arms Used for War Crimes in Gaza
- 10. Ecuador Declares Foreign Debt Illegitimate
- 11. Private Corporations Profit from the Occupation of Palestine
- 12. Mysterious Death of Mike Connell—Karl Rove’s Election Thief
- 13. Katrina’s Hidden Race War
- 14. Congress Invested in Defense Contracts
- 15. World Bank’s Carbon Trade Fiasco
- 16. US Repression of Haiti Continues
- 17. The ICC Facilitates US Covert War in Sudan
- 18. Ecuador’s Constitutional Rights of Nature
- 19. Bank Bailout Recipients Spent to Defeat Labor
- 20. Secret Control of the Presidential Debates
- 21. Recession Causes States to Cut Welfare
- 22. Obama’s Trilateral Commission Team
- 23. Activists Slam World Water Forum as a Corporate-Driven Fraud
- 24. Dollar Glut Finances US Military Expansion
- 25. Fast Track Oil Exploitation in Western Amazon
[Thanks for the tip Crooks and Liars!]
An interesting new white paper contrasts "Public Media 1.0" (public broadcasting, cable access, nonprofit satellite set-asides) with "Public Media 2.0" (multiplatform, participatory, centered around informing and mobilizing networks of engaged users). It says that "the individual user has moved from being an anonymous part of a mass to being the center of the media picture."
- Public Media 2.0: Dynamic, Engaged Publics, by Jessica Clark and Pat Aufderheide, Center for Social Media, School of Communication, American University, Feb 2009. [pdf] (also available in an html version.)
Public broadcasting and other "public media" are facing challenges similar to those that newspapers and libraries are facing in the digital information age. This white paper attempts to re-envision public media, just as many people are trying to re-envision newspapers/journalism and libraries.
The paper focuses more on "content" creation and user-collaboration than on preservation of information, but it does acknowledge the need for funding for what it calls "curation" and archiving. It says that "Commercial platforms do not have the same incentives to preserve historically relevant content that public media outlets do."
The terms "curation" and "stewardship" are often used in discussion of long-term preservation and access to information, but different writers use the terms differently, even interchangeably. This leads to vague, conflicting, and confusing arguments. This white paper defines "curation" more as presentation and commentary than as preservation. In doing so, they miss an opportunity to address the issues of long-term, free, usable, public access to information.
Curation: Users are aggregating, sharing, ranking, tagging, reposting, juxtaposing, and critiquing content on a variety of platforms—from personal blogs to open video-sharing sites to social network profile pages. Reviews and media critique are popular genres for online contributors, displacing or augmenting genres, such as consumer reports and travel writing, and feeding a widespread culture of critical assessment.
Clark and Aufderheide do include libraries as one of the potential partners for public media projects along with other institutions in the nonprofit sector such as universities, museums, and issue-focused educational and social organizations. They note that these institutions have "assets" that "include archives and databases, issue expertise, legitimacy, and trusted brands."
This vision certainly fits in with what John Shuler has been describing in his series on libraries as centers for education and civic engagement. I think libraries looking for service ideas could get some good ones from this report.
But I also think that libraries will need to read beyond this report to find their unique role in society and in facilitating and participating in "Public Media 2.0." Libraries can fill the long-term preservation-and-use gap in the report. Specifically, civic participation needs trusted institutions to select, acquire, organize, and preserve information and provide that information in usable formats in an environment that encourages re-use and the kind of participation described in the white paper. Libraries need to concentrate on those "assets" -- not in the economic sense of private property that is owned and controlled for the benefit of the owners, but as valuable community property, managed and maintained for the community by information professionals.
For Public Media 2.0 to succeed and flourish, for citizens to be able to reliably "aggregate, share, rank, tag, and critique," society needs more than content creators (journalists, broadcasters, writers, analysts, etc.); it also needs institutions that guarantee access and usability of information. It needs libraries.
Thanks to Kevin Taglang, Editor, Communications-related Headlines, Benton Foundation for the pointer to this report!
Here is another non-government site about the economic bailout/recovery. This one is put together by investigative journalists. They are citing the stories they have done on the largest domestic spending bill in U.S. history and highlighting the best reporting from around the Web on the stimulus.
ShovelWatch is a joint project of the non-profit investigative outfit ProPublica, the morning news program The Takeaway and WNYC, New York’s flagship public radio station.
With investigative reporting, interactive features, and (not least) help from you, we’ll be tracking the stimulus bill dollars as they travel from Congress to your neighborhood. With your help, we’ll make sure that one of the biggest, fastest appropriations ever has a big, fast army to track whether it is well spent.
For those of you who follow international news, there is a new source of international journalism to check out, GlobalPost. It has a mission to "redefine international news for the digital age" (mission statement). It "will try to fill a void left by newspapers and network television, which collectively have pulled back sharply in deploying journalists abroad. If it's successful, GlobalPost will be one of the most spectacular against-the-grain stories since news companies began their accelerating revenue slide almost two years ago" (GlobalPost: A startup treads where big media retreat, By David Westphal, OJR, Jan. 8, 2009).
For more background, see the Executive Editor's blog and his links to news coverage of yesterday's launch of GlobalPost.