The Environmental Data & Governance Initiative (EDGI) has just released its 2019 annual report. Check out what they’ve been doing over the past year in archiving data, environmental data justice, their interviewing and policy projects, and website monitoring. Props to EDGI for a year of good work!
From the report…
EDGI’s Archiving Working Group continues to build on its grassroots Data Rescue efforts that involved events in over 40 cities and towns across North America and ended in mid-2018. Our archiving work has:

- Enhanced the public accessibility of downloaded data
- Established partnerships with software companies QRI and Protocol Labs to develop “Data Together,” a set of protocols and technologies for decentralizing data storage online
- Advanced a collaboration with Science 2 Action to build systems to better identify still-vulnerable federal datasets and effectively copy them
- Launched the beta version of our Environmental Impact Statement search tool in consultation with the Sabin Center for Climate Change Law at Columbia University.
Archiving is perhaps the most-changed of EDGI’s areas of work. A year ago, Archiving was the home of a lot of direct work: hosting large data-archiving events and building software tools to support the identification and storage of data. But in the last year, Archiving has become more reflective, quieter, and theory-focused. Archiving continues to hold the data that was harvested in previous years, but now the group gives most of its attention to thoughtful design of data archiving technologies.

There are two main reasons for the shift in focus. One is highly pragmatic: the sheer bulk of volunteer labor required to continuously host events and build software tools was unsustainable. The second reason is more a mark of our organization’s maturation. EDGI’s core strength is not in its capacity to do work; rather, it is in its ways of being, doing, and thinking. EDGI’s unique value is its interdisciplinary site at the crossroads of justice, environment, data, and technology.

As such, Archiving has been focusing on Data Together, an ongoing and inclusive conversation between EDGI and partners QRI and Protocol Labs, both of whom are building foundational technology for storing data in a decentralized internet. All of the partners think daily about data provenance, ownership, and sharing models. The first annual Data Together meeting, in August 2018, yielded the Data Together mission: Data Together empowers people to create a decentralized civic layer for the web, leveraging community, trust, and shared interest to steward data they care about.

The group also completed the first “semester” of a monthly reading group. Through carefully curated reading lists and 90-minute group discussions, the partners covered the topics of: the decentralized web; ownership; commons; centralization vs. decentralization vs. peer-to-peer or federation; privacy; and justice. This is a place for partners to seat their work in broad, theoretical contexts.
We anticipate that the Archival functions within EDGI will continue to change as the organization continues to learn.
Drop everything and watch this presentation from the 2017 Code4Lib conference that took place in Los Angeles March 6-9, 2017. Heck, watch the entire proceedings, because there is a bunch of interesting and thoughtful stuff going on in the world of libraries and technology! But in particular, check out Matt Zumwalt’s presentation “How the distributed web could bring a new Golden Age for Libraries” — after submitting his talk, he changed its title to “Storing data together: the movement to decentralize data and how libraries can lead it” because of the DataRefuge movement.
Zumwalt (aka @FLyingZumwalt on Twitter) works at Protocol Labs, one of the primary developers of the InterPlanetary File System (IPFS) — grok their tagline “HTTP is obsolete. It’s time for the distributed, permanent web!” He has spent much of his spare time over the last 9 months working with groups like EDGI, DataRefuge, and the Internet Archive to help preserve government datasets.
Here’s what Matt said in a nutshell: the Web is precarious, but using a peer-to-peer distributed network architecture we can “store data together”; we can collaboratively preserve and serve out government data. This resonates with me as an FDLP librarian. What if a network of FDLP libraries actually took this on? This isn’t some far-fetched sci-fi idea. The technologies and infrastructures are already there. Over the last 9 months, researchers, faculty, and public citizens around the country have already gotten on board with this idea. Libraries just have to get together and agree that it’s a good thing to collect/download, store, describe, and serve out government information. Together we can do this!
Matt’s talk starts at 3:07:41 of the YouTube video below. Please watch it, let his ideas sink in, share it, start talking about it with your colleagues and administrators in your library, and get moving. Government information could be the great test case for the distributed web and a new Golden Age for Libraries!
This presentation will show how the worldwide surge of work on distributed technologies like the InterPlanetary File System (IPFS) opens the door to a flourishing of community-oriented librarianship in the digital age. The centralized internet, and the rise of cloud services, has forced libraries to act as information silos that compete with other silos to be the place where content and metadata get stored. We will look at how decentralized technologies allow libraries to break this pattern and resume their missions of providing discovery, access and preservation services on top of content that exists in multiple places.
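The mechanism that makes this kind of multi-library stewardship possible is content addressing, the core idea behind IPFS: content is identified by a cryptographic hash of its bytes rather than by the server that happens to host it, so any peer holding a copy can serve it, and anyone can verify what they received. Here is a toy sketch of that concept (this is an illustration only, not the IPFS API or its actual CID format):

```python
import hashlib

class ContentStore:
    """A toy content-addressed store: blocks are keyed by the SHA-256
    hash of their bytes, so any copy of the store arrives at the same
    address for the same content and can verify it on retrieval."""

    def __init__(self):
        self._blocks = {}

    def put(self, data: bytes) -> str:
        cid = hashlib.sha256(data).hexdigest()  # content identifier
        self._blocks[cid] = data
        return cid

    def get(self, cid: str) -> bytes:
        data = self._blocks[cid]
        # Integrity check: the address itself proves the content.
        assert hashlib.sha256(data).hexdigest() == cid
        return data

# Two independent "libraries" holding the same dataset agree on its
# address without any coordination, so either one can serve it.
library_a, library_b = ContentStore(), ContentStore()
cid_a = library_a.put(b"EPA air quality dataset, 2016")
cid_b = library_b.put(b"EPA air quality dataset, 2016")
print(cid_a == cid_b)  # True: same bytes, same address
```

This is why a distributed web suits a network of libraries: no single institution is "the" host, yet every participant can prove it holds an unaltered copy.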
A custom-built app will soon make it easy for anyone to check government URLs to determine whether they have been archived outside of government control and, if not, to submit them for archiving.
- Guerrilla archivists developed an app to save science data from the Trump administration, by Zoë Schlanger, Quartz (February 09, 2017).
The article in Quartz also reports on “data rescue events,” all-day archiving marathons, that have been held in Toronto, Philadelphia, Chicago, Indianapolis, Los Angeles, Boston, New York, and Michigan. Scientists, programmers, professors, and digital librarians are meeting to “save federal data sets they thought could be altered or disappear altogether under the administration of US president Donald Trump.” Jerome Whittington, a professor at NYU who organized one data rescue event, said about data on air and water pollution, contaminated soil, toxic spills, and the violation of rules against dumping harmful waste, “If you don’t have the data, you’ll be told your problem doesn’t exist. It is in a way a struggle over what we consider reality.”
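The article doesn’t describe the app’s internals, but one public way to check whether a URL already has a capture outside of government control is the Internet Archive’s Wayback Machine availability API. A minimal sketch (the endpoint and JSON shape are the public availability API; the helper function names are my own):

```python
import json
import urllib.parse
import urllib.request

AVAILABILITY_API = "https://archive.org/wayback/available"

def availability_query(url: str) -> str:
    """Build the Wayback availability API query for a target URL."""
    return AVAILABILITY_API + "?" + urllib.parse.urlencode({"url": url})

def parse_snapshot(response: dict):
    """Return the closest archived snapshot's URL, or None if the
    page has never been captured."""
    closest = response.get("archived_snapshots", {}).get("closest")
    if closest and closest.get("available"):
        return closest["url"]
    return None

def check_archived(url: str):
    """Query the live API (network required) and report the result."""
    with urllib.request.urlopen(availability_query(url)) as resp:
        return parse_snapshot(json.load(resp))

# Offline example of the JSON shape the API returns:
sample = {
    "archived_snapshots": {
        "closest": {
            "available": True,
            "url": "http://web.archive.org/web/20170201000000/https://www.epa.gov/",
            "timestamp": "20170201000000",
        }
    }
}
print(parse_snapshot(sample))
```

A URL that comes back `None` is exactly the kind of candidate the guerrilla archivists’ app would flag for submission.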
DOE’s Carbon Dioxide Information Analysis Center (CDIAC) shut down without comment; data in preservation danger
This is terrible. The US Department of Energy (DOE) has summarily shut down the Carbon Dioxide Information Analysis Center (CDIAC), located at the Oak Ridge National Laboratory (ORNL) as of 10/1/2016. CDIAC is the primary climate change data and information analysis center for DOE. CDIAC is supported by DOE’s Climate and Environmental Sciences Division within the Office of Biological and Environmental Research (BER).
A friend reports that CDIAC has limited funding and is trying to save its data in the NASA Distributed Active Archive Center (DAAC). There has been no outside comment, and neither DOE nor ORNL has yet issued a press release.
NOTICE: CDIAC as currently configured and hosted by ORNL will cease operations on September 30, 2017. Data will continue to be available through this portal until that time. Data transition plans are being developed with DOE to ensure preservation and availability beyond 2017.
This is absolutely tragic. In 2012, when Canada’s Harper government announced that it would close down national archive sites around the country, it promised that anything discarded or sold would be digitized first. However, reporting now coming out of Canada finds that, along with the closure of some of the world’s finest fishery, ocean, and environmental libraries, a significant amount of irreplaceable collections and data is simply being thrown out or burned. As David Rosenthal noted in his blog post “Threat Model for Archives,” this should be a giant warning to anyone who thinks that “single, government-funded archives are a solution to anything.”
Hutchings said none of the closures has anything to do with saving money, given the small cost of maintaining the collections. He, like many scientists, concludes that Harper’s political convictions are driving the unprecedented consolidation.
“It must be about ideology. Nothing else fits,” said Hutchings. “What that ideology is, is not clear. Does it reflect that part of the Harper government that doesn’t think government should be involved in the very things that affect our lives? Or is it that the role of government is not to collect books or fund science? Or is it the idea that a good government is a stripped-down government?”
Hutchings saw the library closures fitting a larger pattern of “fear and insecurity” within the Harper government, “about how to deal with science and knowledge.”
That pattern includes the gutting of the Fisheries Act, the muzzling of scientists, the abandonment of climate change research and the dismantling of countless research programs, including the world famous Experimental Lakes Area. All these examples indicate that the Harper government strongly regards environmental science as a threat to unfettered resource exploitation.
“There is a group of people who don’t know how to deal with science and evidence. They see it as a problem and the best way to deal with it is to cut it off at the knees and make it ineffective,” explained Hutchings.