Month of October, 2012
Here's something to add to the ol' RSS reader (or follow @crunchgov on Twitter if that's your thang). TechCrunch, one of the better sites for news and information about tech and the tech industry, today launched CrunchGov to track government and tech policy-making. The site launches with three initial CrunchGov products (a congressional report card, a policy database, and legislation crowdsourcing). Read more about it in their post explaining the CrunchGov roll-out as well as the methodology/FAQ behind the site.
Welcome to TechCrunch’s tech policy platform, CrunchGov, a portal for sourcing the most thoughtful people and ideas to facilitate more informed policymaking. Currently, it consists of three areas: a congressional report card, a database of technology legislation, and a crowdsourced legislative utility for contributing ideas to pending bills.
In the wake of mass online protests against the Stop Online Piracy Act (SOPA), officials were eager to learn more about the concerns of those who work in technology and find ways to craft more informed policy. CrunchGov is our attempt at helping policymakers become better listeners, and technologists to be more effective citizens.
Randall Munroe has outdone himself. XKCD, the "webcomic of romance, sarcasm, math, and language," just posted another amazing, wall-sized infographic, this one depicting the historical ideological swings of left, right and center of the US Senate and House of Representatives (here are Randall's other *huge* and hugely fascinating infographics).
Be sure to read the side boxes and especially the one on methodology of how ideology was calculated. He meticulously accounts for the historical shift in the left/right spectrum between Republicans and Democrats.
That is all.
This posting covers a fascinating area of activity for access organizations like the National Security Archive – the international freedom of information movement. Toby McIntosh, a colleague and expert who edits “freedominfo.org,” a FOI clearinghouse sponsored by the Archive, is my co-author on today’s piece, which gives a broad overview of transparency developments overseas.
[By the way, this is our last posting about the National Security Archive. It’s been a pleasure to be a guest blogger this month and I’m grateful to James Jacobs for the invitation. Hope to hear from you or see you at Gelman Library at George Washington University!]
* * * *
STATE OF PLAY
Most Americans would likely agree that the right of access to government information is a cornerstone of our political system. But it would probably surprise a lot of people to know that the U.S. was not the first country to inscribe the concept into law. That honor goes to Sweden, whose Parliament – in 1766 – adopted “His Majesty’s Gracious Ordinance Relating to Freedom of Writing and of the Press,” which provided for abolishing political censorship and securing public access to government documents. Exactly 200 years would pass before the United States enacted the Freedom of Information Act.
The U.S. was still relatively early to the game. Only Finland (which was actually part of Sweden when the first act was passed) approved a similar law before us – in 1951. A handful of other European states followed in the 1970s, and by 1990 there were 14 members in the club. But the years since the fall of the Berlin Wall saw that number rocket upwards.
Today there are 93 countries with freedom of information acts – known also as right-to-know (RTI), or access-to-information laws. Most of these countries – 38 – are in Europe; 22 are in the Americas. Asia has 18; Africa has nine; and the Middle East and Oceania three apiece. (Several of these countries, unlike the U.S., have even put the concept in their Constitution.)
There are multiple reasons for this global blossoming of openness. In some (mostly democratic) countries, like Japan, scandals like the Lockheed bribery case helped drive the process. In Thailand, South Africa and elsewhere, access was part of a broader dynamic of political, economic or educational reform. The collapse of Soviet-led communism from 1989-1991 was a major impetus, prompting several former socialist states to adopt statutes to open their secret histories and help put their pasts behind them.
The individual instigators in different countries were equally diverse, ranging from civil society groups pressing for stricter environmental enforcement or pro-consumer or anti-corruption measures, to parents trying to make school systems operate more fairly.
Two of the biggest international FOI success stories have been India and Mexico.
With legal debate on the issue stretching back to a Supreme Court ruling in 1975 (i.e., that access to information is a fundamental right), India finally passed the Right to Information Act in 2005. A wide-ranging law, its written provisions and implementing measures are often highly creative in the ways they deal with the circumstances facing average citizens.
In the state of Bihar, for instance, where literacy rates are below 50% but cell phone penetration approaches 70%, local authorities created a 24/7 call center to allow the filing of RTI requests. Similarly, with the Internet accessible to only 10% of the population, local government procurement data is literally put up on walls in public areas for all to see. (Unlike in the U.S., India’s 30-day deadline for a response means something. If agencies don’t comply, they get phone calls from RTI authorities demanding that they follow up.)
Mexico’s access to information law, passed in 2002, has become a global model, setting a new international standard for transparency by creating the Federal Access to Information Institute (IFAI), which implements and oversees the law at the national level, and Infomex, a Web site that lets users file information requests electronically. Over 300,000 requests have been submitted since the law was implemented.
These cases are not entirely representative, unfortunately. Getting access laws passed and ensuring they have adequate muscle has been anything but smooth sailing. Government and civil activists face persistent challenges trying to beat back pressures from central authorities, the military, local bureaucrats, or wealthy business interests. Even developments like the war on terror have threatened progress on the openness front (not least in the USA).
Current struggles to get new laws through parliaments are underway worldwide, with hotspots including the Philippines, Ghana and Sierra Leone. Key points of debate usually center around the scope of the law’s coverage, the strength of the exemptions, the time frames for responses, and the system for adjudication of disputes.
After passage of these laws, the controversy often continues. The Indian prime minister set off a firestorm recently by complaining about the “frivolous” use of the Right to Information Act. In Denmark, the government wants to amend its law to better protect materials developed during the policymaking process. And in Scotland, activists want more public-private partnerships covered.
THE ARCHIVE’S ROLE
In the late 1980s, when political ferment was afoot in Eastern Europe during the Gorbachev era, the National Security Archive received a visit from a small group of young political activists from Hungary. Their organization, FIDESZ, wanted to know how to make a freedom of information process work in their country – looking ahead with characteristic optimism (but also great foresight) to the day when the communist regime in power for the previous four decades would finally teeter and fall. So they came to the Archive to hear our experiences, a visit that started a lively and extraordinarily fruitful partnership with similar groups across Eastern Europe and later the former Soviet Union.
In the years since, the Archive has become increasingly active around the world, following events in places as far-flung as South Africa, the Philippines, and Guatemala. By providing our own experiences as a civil society organization and also taking the lead in helping to bring like-minded groups together with FOI legal experts, we’ve worked to get local populations started on the complicated process of building their own information access institutions.
In Mexico, for instance, the Archive collaborates closely with scholars, lawyers, and openness advocates engaged in the public debate about the right to know. We bring international transparency activists to train Mexican NGOs on the effective use of FOI laws in advocacy work. We organize conferences to encourage network-building across the country. We also encourage the news media to monitor government transparency programs and to use FOI laws in pursuit of breaking news stories.
In the former Soviet Union, Archive staff have supported a series of FOIA advocacy groups from St. Petersburg to the Caucasus in their efforts at monitoring, education, and legal work surrounding new pieces of access legislation that have been adopted in Russia and neighboring countries. Their energetic campaign has featured filing lawsuits against the Russian Federal Security Service and applying to the Supreme Court of the Russian Federation in opposition to restrictions on materials on political repression in the Soviet Union. The Archive has also co-organized international conferences and training sessions for FOIA activists from Russia, Georgia, Armenia, Azerbaijan, Ukraine and Kazakhstan.
Campaigns for more and better FOI laws are only a part of the larger transparency picture.
For several years, the Archive has cooperated with human rights groups, ombudsmen, special commissioners, international courts, supreme courts and other official and civil society groups investigating and prosecuting human rights abuses. These efforts typically center around obtaining documentation (from U.S. and local government files) that can be used as evidence in those proceedings. Our staff has been active in a dozen countries, from Peru to Liberia to Indonesia to Spain, witnessing some remarkable results. In 2008, Archive-supplied documents and expert testimony helped convict Peruvian ex-ruler Alberto Fujimori of human rights abuses in the 1990s. These experiences are invaluable for stimulating local governments and groups to press for laws and procedures to open broad public access to their own hidden files.
One of the more significant areas of potential change currently relates to international financial/trade institutions (IFTIs) – from the World Bank to NATO. IFTIs are generally creatures of national central banks that have always been notoriously opaque. In 2003, freedominfo.org launched an initiative to measure, test, compare and ultimately increase openness within these institutions by publishing detailed reports on individual organizations, and thereby sparking a series of collaborations between freedom of information advocates and IFTI campaigners. Freedominfo.org’s continuing work and results can be found in a special section called “IFTI Watch.”
Finally, in September 2011, the Obama administration initiated the Open Government Partnership. The OGP is a “multi-stakeholder collaboration,” drawing in civil society organizations (of which the National Security Archive is one) as well as governments. Eight governments (Brazil, Indonesia, Mexico, Norway, Philippines, South Africa, United Kingdom, and the U.S.) initially endorsed an Open Government Declaration, then promulgated country action plans “to promote transparency, empower citizens, fight corruption, and harness new technologies to strengthen governance.” The OGP now has 57 member nations who have pledged to make commitments toward more open governance. (FreedomInfo.org has written about 100 articles on the OGP.)
With so many new developments on the international front, it’s becoming more of a challenge to keep track of all that is happening. This is a particularly critical issue for those who are working to spread the adoption of RTI laws. Knowing about best practices and being able to draw on the experiences of similarly inclined groups around the world are key to these efforts.
Freedominfo.org is geared toward keeping abreast of these issues. It also provides useful research materials for free distribution. (For the texts of laws, background documents, links to national organizations and country-specific articles, see the “Country Info” tab; or search by country name.)
There’s also an email that goes out to subscribers (no charge) once or twice a week on current news and research, and freedominfo.org’s Blog Roll provides a listing of more than 100 active blogs on FOI issues.
For the best listing of FOI-related conferences and events, see the one maintained by the Carter Center here:
FOI laws internationally vary considerably, but there are not too many broad comparative materials available. One valuable resource is by Toby Mendel, “Freedom of Information: a Comparative Law Survey,” published in many languages by UNESCO:
A country-by-country rating showing wide variation in the quality of the legal framework of FOI laws has been done by the Centre for Law and Democracy and Access Info:
There’s plenty else out there. But we hope this material is a start, and we encourage you and your colleagues to learn more about the international FOI movement. Feel free to sign up with freedominfo.org or any of the other entities above, or write to us with questions.
Deputy Director and Research Director
The National Security Archive
 The World’s First Freedom of Information Act: Anders Chydenius’ Legacy Today, (Kokkola, Finland: Anders Chydenius Foundation, 2006), see www.chydenius.net.
 http://www.freedominfo.org/2012/10/93-countries-have-foi-regimes-most-ta.... At least one expert, David Banisar, counts 99, including countries and “jurisdictions” with “laws” and “regulations.” Download his latest map from http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1857498.
 See the freedominfo.org Web site for details on developments across the world. For an excellent overview of global trends and issues that remains relevant, see Thomas Blanton, “The World’s Right to Know,” Foreign Policy, July/August 2002, pp. 50-58. On the former communist system, see Malcolm Byrne, “Freedom of Information in the Post-Communist World,” Problems of Post-Communism, Vol. 50, No. 2, March-April 2003, p. 56.
 Tom Blanton, presentation, “Access to Information and Accountability: A Global Context,” Woodrow Wilson International Center for Scholars, October 11, 2012. http://www.wilsoncenter.org/event/access-to-information-and-accountabili....
 See the Open Government Partnership Web site: http://www.opengovpartnership.org/about.
For those who missed the fall 2012 Depository Library Conference -- and for those who want to go back and check their notes -- you'll be happy to know that the DLC conference proceedings are now online! There were many informative and interesting sessions of course, but one in particular I'd like to highlight is Chris Brown's presentation, "Fiche Online: A Vision for Digitizing All Documents Fiche" (PDF). I'm excited to see that Chris Brown is moving ahead with this project, as I've been thinking of a similar project for a long time -- for a few years now I've been requesting the purchase of a scanner able to do batch scanning in order to work on it (one of these days, that proposal will get funded!). But what really piqued my interest was when Chris said that he'd like to change the mindset on digitization projects. He called not only for digitization, but for the public sharing of metadata (he called it a "record distribution model"). In this model, digitizing libraries would make their records available via harvest/FTP or some other method, and other libraries would then be able to ingest those records into their own discovery environments. I wholeheartedly agree!!
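Chris's "record distribution model" could be as simple as digitizing libraries exposing their records in a common XML format for others to harvest. Here's a minimal Python sketch of the ingest side, assuming (purely for illustration) that records are shared as Dublin Core XML -- the feed, record IDs, and URLs below are invented:

```python
import xml.etree.ElementTree as ET

# A hypothetical harvested feed, as a digitizing library might expose it.
FEED = """<records xmlns:dc="http://purl.org/dc/elements/1.1/">
  <record id="fiche-0001">
    <dc:title>Annual Report of the Comptroller, 1952</dc:title>
    <dc:identifier>http://example.org/fiche/0001</dc:identifier>
  </record>
  <record id="fiche-0002">
    <dc:title>Treasury Circular No. 230</dc:title>
    <dc:identifier>http://example.org/fiche/0002</dc:identifier>
  </record>
</records>"""

DC = "{http://purl.org/dc/elements/1.1/}"

def ingest(feed_xml):
    """Parse a harvested feed into a local catalog (id -> metadata)."""
    catalog = {}
    for rec in ET.fromstring(feed_xml).findall("record"):
        catalog[rec.get("id")] = {
            "title": rec.findtext(DC + "title"),
            "url": rec.findtext(DC + "identifier"),
        }
    return catalog

catalog = ingest(FEED)
print(len(catalog))  # 2
```

A real pipeline would pull the feed over OAI-PMH or FTP and map the fields into the local discovery system, but the harvest-parse-ingest shape would be the same.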
Chris' presentation and mind-shift proposal are connected to the following FREE O'Reilly webinar in which Pilar Wyman, the President of the American Society for Indexing (ASI), will discuss the very idea that Chris has proposed. Hope you can "attend"!
Adding Value with Metadata: Open up the Index
Friday, November 9, 2012
10AM PT, San Francisco
6pm - London | 1pm - New York | Sat, Nov 10th at 5am - Sydney | Sat, Nov 10th at 3am - Tokyo | Sat, Nov 10th at 2am - Beijing | 11:30pm - Mumbai
Presented by: Pilar Wyman
Duration: Approximately 60 minutes.
In this webcast presentation we'll explore new paths for reusing content metadata for discovery and recommendations. Indexes are one of the most detailed metadata sets available for your content, and can be used to search, recommend, explore, and create buyers for your publications.
We'll talk about:
- baseline metadata
- semantic markup
- whether you need controlled vocabularies across multiple publications
- displaying mashups of multiple indexes
- incorporating social input
About Pilar Wyman
Pilar Wyman is the President of the American Society for Indexing (ASI), the voice of excellence in indexing. A veteran freelance indexer with her own successful business, she is also an active member of the ASI Digital Trends Task Force, which was formed in 2011 to address the continuing and rapidly increasing evolution of book publishing from traditional print to eBook formats. The DTTF was a key player in the recent International Digital Publishing Forum (IDPF) inclusion of indexes in the EPUB standard, and continues to work with the IDPF Indexes Working Group. Within her own indexing and via the DTTF, Pilar and ASI are currently engaged with publishers, hardware manufacturers, and software developers to design and create smart indexes for the digital age.
NARA and NOAA join Old Weather Project to crowdsource transcription of historic naval ship weather logs
Submitted by jrjacobs on Wed, 2012-10-24 10:45.
According to today's press release from NOAA, the National Archives (NARA) and NOAA are teaming up and joining the Old Weather Project hosted at Zooniverse.org to crowdsource the transcription of historic ships' logs in order to extract critical environmental data. The Old Weather Project began over two years ago with British Royal Navy log books -- 16,400 volunteers have transcribed 1.6 million weather observations so far! Transcribed data produced by Old Weather volunteers will be integrated into existing large-scale data sets, such as the International Comprehensive Ocean Atmosphere Data Set (ICOADS). Human volunteers are so important in this case because Optical Character Recognition (OCR) technologies cannot currently recognize hand-written text.
Before there were satellites, weather data transmitters, or computer databases, there were the ship’s logs of Arctic sea voyages, where sailors dutifully recorded weather observations. Now, a new crowdsourcing effort could soon make the weather data from these ship logs, some more than 150 years old, available to climate scientists worldwide.
NOAA, National Archives and Records Administration, Zooniverse — a citizen science web portal — and other partners are seeking volunteers to transcribe a newly digitized set of ship logs dating to 1850. The ship logs, preserved by NARA, are from U.S. Navy, Coast Guard and Revenue Cutter voyages in the Arctic between 1850 and the World War II era.
Organizers hope to enlist thousands of volunteers to transcribe scanned copies of logbook pages via the Old Weather project. Information recorded in these logbooks will also appeal to a wide array of scientists and professionals from other fields, including historians and genealogists, as well as current members and veterans of the U.S. Navy and Coast Guard.
[HT to Gary Price at InfoDocket for calling our attention to this project!]
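How does a project like this turn several volunteers' readings of the same handwritten entry into one trusted value? A common crowdsourcing approach -- offered here as an illustration only, not a description of Old Weather's actual pipeline -- is simple majority voting after normalizing the text:

```python
from collections import Counter

def consensus(transcriptions):
    """Return the most common reading among volunteer transcriptions,
    after normalizing whitespace and case."""
    normalized = [" ".join(t.split()).lower() for t in transcriptions]
    return Counter(normalized).most_common(1)[0][0]

# Three volunteers transcribe the same (invented) logbook entry;
# one misreads a digit, but the majority reading wins.
readings = [
    "Baro 29.92, air temp 41",
    "baro 29.92,  air temp 41",
    "Baro 29.32, air temp 41",
]
print(consensus(readings))  # baro 29.92, air temp 41
```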
One of the many bright spots of last week's Fall 2012 Depository Library Conference -- the notes and proceedings will soon be posted on the FDLP Desktop -- was the announcement by the Government Printing Office (GPO) that GPO and the US Department of the Treasury are partnering on a project to bring historic digitized Treasury publications onto the FDsys platform. This is a great step by GPO toward providing a platform for Federal agencies to publish their historically relevant publications for better public access.
The U.S. Government Printing Office (GPO) and the U.S. Department of Treasury have partnered on a pilot project to make historical digitized content from the Treasury Library available on GPO’s Federal Digital System (FDsys). Through the pilot project, Treasury Reporting Rates of Exchange, 1956-2005, which list the exchange rates of foreign currencies based on the dollar, are now available on FDsys. Over the next year, additional historical documents within the Treasury’s library collection will be made available on FDsys through this pilot project.
Spammers Using Shortened .gov URLs, by Ravi Mandalia, Parity News (20 October 2012).
Cyber-scammers have started using 1.usa.gov links in their spam campaigns in a bid to fool gullible users into thinking that the links they see on a website or receive in their mail or newsletters lead to legitimate US Government websites.
Spammers create these shortened URLs by exploiting a loophole in the URL-shortening service provided by bit.ly. USA.gov and Bit.ly have collaborated to enable anyone to shorten a .gov or .mil URL into a trustworthy-looking 1.usa.gov URL.
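The underlying trick is an open redirect: the spammer shortens a real .gov page that forwards visitors to whatever URL is passed in its query string, so the 1.usa.gov link is "legitimate" but the destination is not. A sketch of how one might spot such links -- the .gov path and parameter names below are invented examples, not the actual vulnerable endpoints:

```python
from urllib.parse import urlparse, parse_qs

def embedded_redirect(url):
    """Return any full URL embedded in the query string -- the pattern
    open redirects use -- or None. Parameter names are common examples."""
    qs = parse_qs(urlparse(url).query)
    for param in ("url", "redirect", "dest", "goto"):
        for value in qs.get(param, []):
            if value.startswith(("http://", "https://")):
                return value
    return None

# A 1.usa.gov link might expand to a .gov page that silently forwards on:
expanded = "http://example.gov/exit?url=http://spam.example.com/offer"
print(embedded_redirect(expanded))  # http://spam.example.com/offer
```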
Congratulations to the Newark Public Library, the Washington University in St. Louis Libraries, and the University at Buffalo Libraries for being named 2012 FDLP depositories of the year!
For the first time, the U.S. Government Printing Office (GPO) honored three extraordinary Federal Depository Libraries of the Year at the 2012 Depository Library Council Meeting and Federal Depository Library Conference.
One regional depository and two selective depositories received special recognition for going above and beyond to further the Federal Depository Library Program's (FDLP's) mission of ensuring the American public has free access to its Government's information.
The three libraries chosen this year have demonstrated extraordinary levels of service to expand access to Federal Government collections and services.
GPO is proud to honor:
* Newark Public Library (Newark, New Jersey)
* The Olin Library at Washington University (St. Louis, Missouri)
* The University at Buffalo Libraries (Buffalo, New York)
Acting Public Printer Davita Vance-Cooks presented the awards to the esteemed recipients, on behalf of GPO and the FDLP.
The Newark Public Library has served as the regional library for the other Federal depository libraries in the state of New Jersey for nearly 50 years. It was selected for making the best use of limited resources and continuing to provide excellent public services.
The Olin Library is being honored for providing training opportunities to other depository librarians in the area and for collaborating with their regional depository to ensure the needs of the populous St. Louis metro area are served.
GPO is recognizing the University at Buffalo Libraries for maintaining several services which provide Federal depository libraries valuable assistance in processing U.S. Government publications received through the FDLP.
"I commend the Newark Public Library, the Olin Library, and the University at Buffalo Libraries for their contributions to the FDLP and outstanding commitment to serving their communities," said Acting Public Printer Davita Vance-Cooks. "GPO thanks all of the Federal depository libraries for playing a critical role in providing and expanding public access to Government information."
This week’s posting, our third on the National Security Archive, describes an area of activity that may be of particular interest to the librarians among this site’s readers – the methodology we use to put together the Archive’s flagship publication series, the “Digital National Security Archive.” DNSA is a constantly growing, highly curated collection of declassified documentation covering topics in the history of U.S. foreign policy from the 1940s to the present. It is published by ProQuest. Today’s blog is written by staff Indexer Stacey Chambers, on behalf of the National Security Archive’s Production Team.
* * * *
The word “production” may evoke machinery and factory workers churning out widgets, but at the National Security Archive, the word refers to the careful analysis and description of documents for publication in the Digital National Security Archive (DNSA) and its print and microfiche counterparts.
DNSA currently consists of 38 aggregated, large-scale publications on a wide array of subjects – from nuclear history to the Cuban missile crisis to the Soviet invasion of Afghanistan; from the U.S. intelligence community to the military uses of space; and from U.S. policy toward its great Cold War rivals – the USSR and China – to America’s relations with a host of other countries: Japan, Korea, South Africa, Nicaragua, El Salvador, the Philippines and elsewhere. The collections average about 2,500 documents apiece, and altogether total more than 650,000 pages of declassified records to date. The front matter – along with the cataloging and indexing that our staff of three trained librarians produces – amounts to as much as 1,200 pages of printed text per set.
We in the National Security Archive’s Production Team do type mass quantities of words and characters into database records – but the data in question is the result of extensive research, editing, and review – and before that, the product of the expert selection, legwork, and meticulous preliminary cataloging of project analysts and their assistants.
Indeed, the life of a DNSA collection, or “set,” begins with an analyst. By the time the Production Team enters the picture, an analyst will have already spent three-to-five years (sometimes more) identifying and amassing a large pool of documents – arduously obtained through Freedom of Information Act requests, research at relevant archives and repositories, and occasional donations – before carefully choosing the records that will go into the final publication. (Each project has an advisory board of outside experts who are consulted about set content.) The analyst’s team will have also begun or completed work on an introductory essay, chronology, glossaries, and other invaluable front matter.
Drawing upon this information, a project analyst delivers a detailed briefing about the subject matter of the collection to the Production Team, to provide the context and scope that is so important to have in mind as we start our phase of the process. To supplement this briefing, indexers also take a “reading period” to absorb the content of analyst-recommended and well-indexed books that we consult continually.
Next, the analyst-selected boxes of paper documents, or scanned PDF documents viewable through specially designed repository software (originally produced by Captaris/Alchemy), arrive in the Production Team’s seventh-floor office at Gelman Library – already crowded with boxes of documents, reference works, and views of goings-on outside nearby George Washington University buildings – and Production Team members each claim batches of documents and set about cataloging them.
In doing so, we typically capture the following information from a document: its date (or estimated “circa date”); title (or “constructed title,” for those documents lacking a formal or descriptive title); originating agency; document type and number; highest official classification; length; excisions and annotations; bricks-and-mortar location or Web site from which the document was originally retrieved, if not through FOIA; keywords; and personal names and organizations cited. We also record information about the physical state of a document – for example, if it is missing pages or is difficult to read – and construct a brief, one-paragraph abstract, or précis, to describe the document’s content. We have refined these metadata fields over the course of 20 years of producing these sets and obtaining feedback from users.
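For the non-catalogers out there, the record structure described above might look something like this sketch (an illustration only -- the Archive's actual Cuadra STAR schema surely differs, and the field names here are my own):

```python
from dataclasses import dataclass, field

@dataclass
class DNSARecord:
    """Illustrative shape of one catalog record, per the fields described."""
    date: str                     # or an estimated "circa date"
    title: str                    # "constructed title" if none exists
    agency: str = "Origin Unknown"  # fallback when origin can't be verified
    doc_type: str = ""
    classification: str = ""      # highest official classification
    pages: int = 0
    source: str = "FOIA"          # or the repository / Web site of origin
    keywords: list = field(default_factory=list)
    names: list = field(default_factory=list)   # persons and organizations
    abstract: str = ""            # one-paragraph précis

rec = DNSARecord(date="1979-06-11", title="Briefing Memorandum",
                 agency="Department of State", pages=3,
                 keywords=["human rights"])
print(rec.agency)  # Department of State
```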
To populate the foregoing fields in our Cuadra STAR bibliographic database, we consult a range of reference sources, from Wikipedia as a starting point – we love its articles’ External Links section – to the State Department’s Office of the Historian Web pages and prized copies of decades-old State Department and Defense Department telephone books – to the wealth of subscription reference databases freely available to us through GWU’s Gelman Library.
Further, we consult the Getty Thesaurus of Geographic Names to verify geographic terms, and the Library of Congress Authorities to verify personal names [e.g., the Korean name “Ahn, S.H. (Se Hee)”] and sometimes subjects – though we primarily base the concepts contained in our authority file on the United Nations Bibliographic Information System. (Our internally generated authority file currently approaches 56,000 entries.) Where the UNBIS is lacking for our purposes – particularly in military and intelligence terminology – we may consult military branch Web pages, hard copies of specialized encyclopedias, old volumes of the Europa World Year Book, or other authoritative sources to establish or update terms, while adhering to such general guidelines as our in-house cataloging manual, the Chicago Manual of Style, and of course, the common sense of a user’s perspective.
Despite the excellent resources at our disposal, we often encounter unknowns in cataloging – but find that solving a mystery in one document frequently solves those in others. For example, in many cases, information about the agency that produced a document is missing, but by examining the document’s context, FOIA or repository information, format, and the analyst’s expertise, we can reach an educated conclusion.
Similarly, we may also safely deduce the real name of a person to whom a misspelled name refers. In one fun example in which human indexing proves indispensable, while working on a collection of “The Kissinger Telephone Conversations: A Verbatim Record of U.S. Diplomacy: 1969-1977,” indexers encountered a memorandum of a telephone conversation in which the secretary who was secretly transcribing Henry Kissinger’s words recorded what she heard as “Nelson’s tongue,” when in fact they were talking about Mao Zedong! In such cases, we document in an internal-memo field the process we used to arrive at our decision. However, in cases when we have too many doubts due to too few context clues to verify facts, we must resort to entering a field value of “Origin Unknown,” or to leaving non-required fields blank.
The same principle applies to the document-level indexing and abstracting part of the Production Team’s work: one archival document often informs another. For example, a memorandum or quickly jotted note may not expressly state its context, but through our understanding of the context in which it was created, we add the words – in the subject or abstract field – that render documents on similar subjects retrievable. Nobody sits down at a meeting and declares, “Now let’s talk about human rights”; they just do it. So, our job is to grasp the context and determine the subject, especially when it is not explicitly stated.
On occasions when we cannot resolve a quandary ourselves, we may turn directly to the project analyst for answers – including to question whether a particular document belongs in a set. For example, in the forthcoming collection “Argentina, 1975-1990: The Making of U.S. Human Rights Policy,” analyst Carlos Osorio confirmed that even though a 1979 briefing memorandum did not mention Argentina specifically, the document should be retained because it showed how U.S. policymakers were shifting their attention to Central America.
Throughout the four-to-six months we typically spend producing a set, we will have taken several steps along the way to preserve quality: held regular terms meetings in an effort to control the consistency and usability of the set’s vocabulary; reviewed printouts of completed catalog records against the documents; addressed any outstanding copyright issues; and edited abstracts multiple times.
Once we have finished all indexing, abstracting, and reviewing, we proceed to a final quality-control phase involving lots of coffee and reading aloud from oversized sheets of paper, to ensure that records are in their proper order; and to resolve any errors, inconsistencies, or outstanding issues. The process begins with assigning each record a sequence number, dictated by the records’ correct arrangement by date, and then alphabetically by a series of descending elements. Then indexers work in pairs to verify the accuracy of each data element. When we’re done, we repeat the process to ensure that the correct catalog record is assigned to the document it describes. This process may from the outside appear tedious or even torturous, but it is needed to deliver a clean, finished product to our co-publisher, ProQuest.
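The sequencing step described above -- sort chronologically, break ties alphabetically on a series of elements, then number the results -- can be sketched in a few lines (field names and sample records are invented for illustration):

```python
# Order records by date, then by tie-breaking elements, then number them.
records = [
    {"date": "1962-10-22", "title": "Radio Address", "agency": "White House"},
    {"date": "1962-10-16", "title": "Memorandum", "agency": "CIA"},
    {"date": "1962-10-16", "title": "Cable", "agency": "State"},
]

ordered = sorted(records, key=lambda r: (r["date"], r["title"], r["agency"]))
for seq, rec in enumerate(ordered, start=1):
    rec["seq"] = seq  # the sequence number dictates final arrangement

print([(r["seq"], r["date"], r["title"]) for r in ordered])
```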
Meanwhile, the Production Team will have also taken time to create and review hierarchical cross references among the set’s records, so that users are appropriately redirected – not only to broader or narrower terms used in the set, but also from commonly used acronyms or plain-language terms to the set’s controlled vocabulary. Also during this wrap-up stage, the Production Director and Publications Director will have been generating last-minute lists to be checked, editing the set’s vital front matter in collaboration with the project analyst and Research Director, and tending to other publishing demands ... that is, until the analyst calls with “a few more essential documents … ”
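Those cross references work like a classic thesaurus: USE entries redirect acronyms and plain-language terms to the controlled vocabulary, while BT/NT entries point users to broader and narrower terms. A toy example (the terms and structure here are invented for illustration, not taken from the DNSA authority file):

```python
# A tiny cross-reference table: USE redirects a non-preferred term to the
# controlled vocabulary; BT/NT list broader/narrower terms.
XREFS = {
    "SALT": {"USE": "Strategic arms limitation talks"},
    "Arms control": {"NT": ["Strategic arms limitation talks"]},
    "Strategic arms limitation talks": {"BT": ["Arms control"]},
}

def resolve(term):
    """Follow USE references until a preferred term is reached."""
    while "USE" in XREFS.get(term, {}):
        term = XREFS[term]["USE"]
    return term

print(resolve("SALT"))  # Strategic arms limitation talks
```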
The National Security Archive