You are viewing an archived web page, collected at the request of Cornell University Library using Archive-It. This page was captured on 16:45:45 Feb 26, 2016, and is part of the Cornell University Archives Web Archive collection. The information on this web page may be out of date.

Abstracts

Stephen ABRAMS, Harvard University Library

The format of a digital object must be known in order to interpret the information content of that object properly. Strong format typing is therefore fundamental to the effective use, interchange, and preservation of all digitally-encoded content. In terms of the OAIS reference model, format typing is a component of an object's representation information. Formats themselves also have representation information--primarily, the set of syntactic and semantic rules for encoding content into digital form--that must be preserved to address the concern raised by the Library of Congress's recent planning report, Preserving Our Digital Heritage: "Longevity of digital data and the ability to read those data in the future depend upon standards for encoding and describing, but standards change over time." The Andrew W. Mellon Foundation has funded an effort by the Harvard University Library to create a Global Digital Format Registry (GDFR) that will provide preservation practitioners with sustainable services to store, discover, and deliver representation information about digital formats. This presentation will provide an update on GDFR project activities.

Reinhard ALTENHÖNER and Tobias STEINKE, Deutsche Nationalbibliothek (DNB), the National Library of Germany

Data for the future - the kopal project -- The German project kopal is developing a cooperatively used long-term digital archive. The archival system of kopal can be used by different clients, much like a bank account. It is hosted by an experienced service provider and based on software by IBM. Its interfaces are openly defined, so that open software and standard object formats can be used to connect to them. As a result of the project, an open archiving and exchange format for digital objects was specified, called the Universal Object Format. There is also an open-source software library called koLibRI to build objects, generate technical metadata, and manage communication with the OAIS-compliant system. Migration and emulation will both be supported, but migration in particular is an essential part of koLibRI and of the concept of technical metadata within data management. The archival system is ready to be used by the project partners, and ca. 10 TB of data will have been ingested by the end of 2006.

Svein Arne BRYGFJELD - Senior Advisor, National Library of Norway

The Norwegian Digital Radio Archive -- The Norwegian Broadcasting Corporation (NRK, www.nrk.no) and the National Library of Norway have been working together for six years to digitise the historical radio archive of NRK. The project aims at preservation, professional re-use, and general access. More than 50,000 hours (170,000 programs) have now been digitised in high quality. The archive is built within the National Library of Norway, but serves as the everyday radio archive for NRK, whose headquarters are 1,000 km away. The archive is also now the repository where the National Library of Norway stores copies of the daily radio broadcasts in Norway, enabling the archive to be updated on a daily basis. The archive is also integrated with FEIDE (www.feide.no), a federated authentication regime for the research and education sector in Norway. Based on FEIDE, the National Library of Norway has established role-based access control for the archive.

Priscilla CAPLAN - Florida Center for Library Automation

The Florida Digital Archive (FDA) is a digital preservation repository run by the Florida Center for Library Automation (FCLA) for the use of the eleven public universities of Florida. The FDA went into production in November 2005 and ingested 108,607 files (2.2 TB) in its first six months of operation. The FDA uses a locally developed software application called DAITSS to support the repository functions of ingest, data management and dissemination, and the preservation functions of format normalization and format migration. Functionally, DAITSS consists of a set of services. The Ingest Service performs format validation, extraction of technical metadata, creation of derivative files through normalization and/or migration, and storage preparation. Storage Maintenance assures that stored masters remain good copies on readable media. The Access Service is responsible for access control, reporting, and building Dissemination Information Packages in response to requests. Other services include Withdrawal and Repository Management. The system does not support discovery functions for end-users or real-time online access to archived materials. It can, however, be used as a preservation back-end to institutional repositories, publishing systems, digital libraries, or other user-oriented applications. Following the OAIS reference model, DAITSS assumes a community of Producers who select, prepare and submit materials for preservation. In the FDA context, these are the libraries of the public universities of Florida, which have negotiated archiving agreements with the repository. To date, most submissions have been preservation masters from local digitization projects, and electronic theses and dissertations (ETDs). This presentation will give an overview of the Florida Digital Archive and the underlying DAITSS application, which is targeted for release as open source software in 2006.

Robin L. DALE -- Program Officer in RLG-Programs

Over the last three years, the RLG-NARA Task Force on Digital Repository Certification has worked to develop a checklist that would allow objective evaluation of digital repositories and archives. The checklist was based upon international standards and community best practices for digital preservation, as well as influences from information technology and information security standards. The goal was to develop metrics to evaluate three aspects influencing repository trustworthiness: the characteristics of the archiving organization that affect performance, accountability, and business continuity; the technologies and technical infrastructure employed by the repository; and finally the processes and procedures utilized by the repository. A draft checklist was released for public review and comment in August 2005, leading to further development. In addition, the checklist has been leveraged by the Center for Research Libraries Auditing and Certification of Digital Archives project and used in test audits of three digital repositories and archives. Through public comment, practical application, and other CRL project activities, we have made further progress toward the development of a fully-fledged program of audit & certification for digital repositories. This presentation will discuss checklist development, envisioned uses of the checklist, the emerging audit & certification process, potential audit & certification process outputs, and compliance issues.

Susanne DOBRATZ - Head Electronic Publishing Group, Humboldt-University Berlin   

The Criteria for Trusted Digital Long-Term Preservation Repositories (urn:nbn:de:0008-2006060710 or http://edoc.hu-berlin.de/series/nestor-materialien/2006-8/PDF/8.pdf) have been developed by the nestor project and are now open for public comment. This criteria catalogue primarily addresses cultural heritage organizations, federal archives, libraries and museums, and is designed as a guideline for the planning and setup of a digital long-term preservation repository. Secondarily, the catalogue can serve as an orientation guide for software developers, third-party vendors or service providers from the private sector. The nestor catalogue focuses on its applicability in Germany; still, it must be discussed internationally and should adhere to international standards.

Eileen FENTON - Executive Director, Portico

The work of the academy – research and teaching – is not possible without reliable access to the accumulated scholarship of the past. One component of this scholarly record, academic journals, is increasingly electronic – and fragile – and its future accessibility is a growing concern. The recent statement "Urgent Action Needed to Preserve Scholarly Electronic Journals", endorsed by leading libraries and organizations such as ARL and ALCTS, underscores the urgency of this community need. But the scale and complexity of the technology infrastructure, specialized expertise and quality control processes necessary to preserve electronic resources exceed what can be supported by any individual library or institutional budget. This presentation will provide a brief history of Portico, the not-for-profit electronic archiving service developed in response to the library community's need for a robust, reliable means to preserve electronic scholarly journals. Portico was initiated by JSTOR and has been developed with the initial support of Ithaka, The Andrew W. Mellon Foundation, and the Library of Congress. Portico's mission is to preserve scholarly literature published in electronic form and to ensure that these materials remain accessible to future scholars, researchers, and students. In addition to an overview of the Portico service and access model developed with input from publishers and libraries, the presentation will include an update on library and publisher participation and the status of Portico's archival operations.

Carl FLEISCHHAUER - Project Coordinator, Office of Strategic Initiatives, Library of Congress

This presentation will compare and contrast the audio-visual reformatting carried out in memory institutions with the production, distribution, and archiving of born digital content by record labels, television broadcasters, and motion picture producers. For sound recordings, digital practices have been established in both memory institutions and the industry, although issues regarding multi-track productions are troublesome for the industry. For video, digital approaches are widely used for reformatting by libraries and for new broadcast production, but most video masters continue to be recorded on conventional digital videotape rather than as media-independent digital file formats. For theatrical films, most memory institutions continue to reformat using proven photo-chemical approaches.  Meanwhile, the production of new theatrical films is genuinely hybrid: Hollywood's current workflow mixes digital and film-based stages and elements. It is also the case, however, that high costs and the lack of relevant standards prevent motion picture producers from identifying and implementing a clear-cut digital solution for the long-term archiving of their valuable production assets. In all three areas--sound, video, and cinema--memory institutions and the industry face similar problems. Will they find ways to share their investigations and develop common solutions?

Patricia GALLOWAY - School of Information, University of Texas at Austin

Repository Institutionalization: What makes it worth becoming infrastructure? Since the advent of inexpensive or open-source institutional repository software, libraries and other institutions have wrestled with delivering services that could justify supporting the repository through chargebacks of some kind; yet a major problem has been persuading people to fill the repositories with content of adequate value to prospective user communities. Since 2003 we have been running DSpace repository software in the School of Information, University of Texas at Austin, at first simply as a testbed for student work but since spring 2005 as an institutional archival repository. In the course of this experience, following the "seed-evolve-reseed" model developed by Gerhard Fischer at the University of Colorado for collaborative development of computing environments, we have begun to demonstrate the value of such a repository in several different directions: fulfilling the state of Texas statutory requirements for preserving official documents (in which the official version of administrative information is posted on the School's website); providing faculty (and students) with preprint and postprint exposure of their work; providing secure storage for learning objects, including presentations and tutorials; providing secure archiving for digital materials of historical value both to the School and to the discipline; providing an environment for students to work on the problems of digital archiving, populated with objects that will over time and in their turn demonstrate further problems of digital archiving; and providing the Information Technology service with a secure archive for versions of installed software and content. These uses within the School alone are beginning to weigh heavily enough in the balance to warrant commitment of funding for the infrastructure needed to administer and deliver these educational services.

David GEWIRTZ - Project Manager and Digital Preservation Architect, Yale University; and Gretchen GANO - Public Administration and Government Information Librarian, New York University

A Preservation Model for Social Science Numeric Data Collections: PREMIS and FEDORA. This session will outline the workflow associated with migrating social science data collections into FEDORA, focusing on the implementation of PREMIS metadata as a component of the submission information package (SIP). Presenters will identify how the PREMIS data model serves to specialize the packaging of the SIP so that access aids can exploit the information package when it is transformed into an archival information package (AIP). Presenters will outline an example expression of PREMIS for social science datasets and will demonstrate how this metadata may be stored in a FEDORA repository. Data management issues, including normalization, will also be explored. Examples from the ongoing project to migrate the Yale Social Science Data Archive from a PostgreSQL database into FEDORA will be provided.
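A PREMIS object entry of the kind packaged into a SIP can be sketched in a few lines. The element names below follow the PREMIS data dictionary, but the identifier scheme, sample values, and helper function are illustrative assumptions, not the Yale project's actual profile:

```python
import xml.etree.ElementTree as ET

# Hypothetical sketch: build minimal PREMIS-style object metadata for a
# dataset prior to SIP assembly. Namespace and element names follow the
# PREMIS data dictionary; the values are invented for illustration.
PREMIS = "http://www.loc.gov/standards/premis"
ET.register_namespace("premis", PREMIS)

def premis_object(identifier, fmt, size):
    """Return a <premis:object> element describing one dataset file."""
    obj = ET.Element(f"{{{PREMIS}}}object")
    oid = ET.SubElement(obj, f"{{{PREMIS}}}objectIdentifier")
    ET.SubElement(oid, f"{{{PREMIS}}}objectIdentifierType").text = "local"
    ET.SubElement(oid, f"{{{PREMIS}}}objectIdentifierValue").text = identifier
    chars = ET.SubElement(obj, f"{{{PREMIS}}}objectCharacteristics")
    ET.SubElement(chars, f"{{{PREMIS}}}size").text = str(size)
    fmt_el = ET.SubElement(chars, f"{{{PREMIS}}}format")
    desig = ET.SubElement(fmt_el, f"{{{PREMIS}}}formatDesignation")
    ET.SubElement(desig, f"{{{PREMIS}}}formatName").text = fmt
    return obj

xml = ET.tostring(
    premis_object("study-0001", "text/tab-separated-values", 2048),
    encoding="unicode",
)
```

In a real workflow this fragment would be wrapped, together with descriptive and structural metadata, into the SIP submitted for transformation into an AIP.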

Ann GREEN and Myron GUTMANN - ICPSR, University of Michigan

In developing and debating digital repositories, the digital library world has devoted more attention to their missions and roles in supporting access to and stewardship of academic research output than to discussing discipline, or domain, specific digital repositories. This is especially interesting, given that in social science these domain-specific repositories have been in existence for many decades. The goal of this presentation is to juxtapose these two kinds of repositories and to suggest ways that they can help build partnerships between themselves and with the research community. It is based on the fundamental idea that all the parties involved share important goals, and that by working together these goals can be advanced successfully. We will begin by characterizing the life cycle of social science research, before turning to key elements of the two different kinds of repositories, and then to our recommendation that researchers and the two different kinds of repositories can forge partnerships. The key message is that by visualizing the role of repositories explicitly in the life cycle of the social science research enterprise, the ways that the partnerships work will be clear. These workings can be seen as a sequence of reciprocal information flows between parties to the process, triggers that signal that one party or another has a task to perform, and hand-offs of information from one party to another that take place at crucial moments. This approach envisions both cooperation and specialization. The researcher produces the scientific product, both data and publications; the institutional repository has specialized knowledge of campus conditions and the opportunity to interact frequently with the researcher; and the domain-specific repository has specialized knowledge of approaches to data in a specific scientific field, for example domain-specific metadata standards, as well as the ability to give high-impact exposure to research products.

Terry EHLING - Center for Innovative Publishing, Cornell University Library presenting on behalf of Robert HANISCH - Space Telescope Science Institute

Digital Data Preservation and Curation: A Collaboration Among Libraries, Publishers, and the Virtual Observatory -- Astronomers are producing and analyzing data at ever more prodigious rates. NASA's Great Observatories, ground-based national observatories, and major survey projects have archive and data distribution systems in place to manage their standard data products, and these are now interlinked through the protocols and metadata standards agreed upon in the Virtual Observatory. However, the digital data associated with peer-reviewed publications is only rarely archived. Most often, astronomers publish graphical representations of their data but not the data themselves. Other astronomers cannot readily inspect the data to either confirm the interpretation presented in a paper or extend the analysis. Highly processed data sets reside on departmental servers and the personal computers of astronomers, and may or may not be available a few years hence. We are investigating ways to preserve and curate the digital data associated with peer-reviewed journals in astronomy. The standards and protocols of the VO provide one component of the necessary technology. A variety of underlying systems can be used to physically host a data repository, and indeed this repository need not be centralized. The repository, however, must be managed, and data must be documented through high-quality, curated metadata. Multiple access portals must be available: the original journal, the host data center, the Virtual Observatory, or any number of topically-oriented data services utilizing VO-standard access mechanisms.

Robin HAUN-MOHAMED -- Director, Collection Management and Preservation, U.S. Government Printing Office

The U.S. Government Printing Office (GPO) is committed to ensuring permanent public access and preservation of U.S. Federal publications. Historically, preservation of the tangible materials has been handled in collaboration with the individual libraries participating in the Federal Depository Library Program. Advances in information dissemination and the need to preserve digital content have necessitated changes in processes associated with GPO's production and distribution of Federal publications. Working with U.S. Government agencies, depository libraries and other interested parties, GPO is moving to implement a life-cycle approach to the publishing of Federal publications, to ensure not only access but also preservation of the objects for the future. This presentation will focus on GPO's collaboration with other communities of practice and the development of FDsys, GPO's digital content management system.

William KEHOE and Adam SMITH, Cornell University Library; and Markus Enders, Goettingen State and University Library (SUB)

Bringing many tools together to build a system of co-operating OAISs in the MathArc project -- The MathArc project has created a protocol, software, and registry that enable multiple institutions to share and store digital objects in each other's OAIS repositories, regardless of the nature of each system's underlying repository. In the pilot version, the Goettingen State and University Library (SUB) and the Cornell University Library (CUL) are sharing, storing, and managing collections preserved in Goettingen's kopal system (based on DIAS) and Cornell's CUL-OAIS (based on aDORe). The tools and standards used to build the system are familiar to those working in the digital preservation field and have been described and presented in many places. They include METS, OAI-PMH, PREMIS, JHOVE, LOCKSS, aDORe, and the kopal version of DIAS. This presentation describes not how they work individually, but how they all work together in the MathArc system.

John Kunze, California Digital Library

Low-Risk Persistent Identification: the "Entity" (N2T) Resolver -- The N2T ("entity") identifier resolver addresses the same problem as URN, Handle, and DOI resolvers, but does so without complex or proprietary software components. N2T is lower-risk than those resolvers because it relies only on off-the-shelf open-source components, and it is the only resolver to acknowledge and address the "namespace splitting problem". N2T (Name-to-Thing) is both a persistent identifier resolver and a consortium of cultural memory organizations. The consortium has no fees or requirements, and merely offers its members the option to publicize a protected form of their URLs supported by the resolver. The resolver is a small, standard web server run in several mirrored instances by consortium volunteers under one hostname rented for about $30 USD per year. The resolver works equally well with any identifier scheme (URLs, ARKs, Handles, DOIs, URNs, PURLs) that can be expressed inside a URL.
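The core resolution step is deliberately simple. As a rough sketch (the binding table, URLs, and function name here are hypothetical, not N2T's actual code), a scheme-agnostic resolver is little more than a lookup from persistent name to current location, served from a stable hostname:

```python
# Minimal sketch of scheme-agnostic name-to-thing resolution: one binding
# table maps identifier strings to target URLs, regardless of whether the
# name is an ARK, DOI, Handle, or URN. All entries are invented examples.
BINDINGS = {
    "ark:/12345/x9k3": "https://repository.example.org/objects/x9k3",
    "doi:10.9999/demo.1": "https://journal.example.org/articles/demo.1",
    "urn:nbn:de:0008-2006060710": "https://edoc.example.org/nestor/8.pdf",
}

def resolve(name):
    """Return the location currently bound to a persistent name, or None.

    A production resolver would also normalise the name (case folding,
    hyphen removal, etc.) and issue an HTTP redirect to the target.
    """
    return BINDINGS.get(name.strip())
```

Because the table keys are opaque strings, adding support for a new identifier scheme requires no new software, only new bindings; this is one way the "off-the-shelf components" claim can be read.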

Bronwyn LEE - Librarian and Business Analyst, National Library of Australia

Preservation metadata requirements for repositories: a project of the Australian Partnership for Sustainable Repositories (APSR) -- APSR aims to establish a centre of excellence in sustainable digital resource management, and partner universities are developing demonstrator repositories built on sustainability principles. This paper presents the work of a project commissioned by APSR to specify requirements for the collection of metadata needed for long-term continuity of access to digital collections. The project was called PRESTA (PREMIS Requirements Statement), but it took a broader view than PREMIS alone.

Robert McDONALD - Associate Director of Libraries for Technology and Research, Florida State University, and Tyler WALTERS - Associate Director for Technology and Resource Services, Georgia Institute of Technology

The MetaArchive of Southern Digital Culture will discuss the first two years of deliverables (2004-2006) of their three-year partnership for establishing a collaborative digital preservation network for southern cultural heritage materials. The MetaArchive of Southern Culture is a multi-institutional partner with the Library of Congress in its National Digital Information Infrastructure and Preservation Program (NDIIPP) and has established a distributed model for digital preservation. Headed by Emory University, this test-bed implementation network involves six academic research libraries (Auburn University, Emory University, Georgia Institute of Technology, Florida State University, Virginia Polytechnic Institute and State University, University of Louisville) as well as a strategic alliance with the Stanford University-based LOCKSS Program and the Library of Congress.

During the first two years of this project the MetaArchive partnership has implemented a collaborative digital preservation network and has tested the LOCKSS toolset for scale (3 terabytes), for ingest (HTTP and OAI-PMH), and for reliability (total system failure). The group has also developed a collection-level conspectus that inter-operates with LOCKSS ingest plug-ins and the collection selection policies of the group. Included in this presentation will be a discussion of the collaborative partnership agreements and the forthcoming work on the next phases of the project, which include establishing a non-profit entity that specializes in digital preservation networks for cultural heritage institutions, creating a formalized business model to advance our collaborative paradigm outside of our Library of Congress partnership, conducting document format tests involving LOCKSS on-access migration and batch migration strategies, and implementing a preservation management framework that incorporates other open-source preservation software tools capable of integrating with our current LOCKSS-based network.

Heike NEUROTH - Göttingen State and University Library (SUB), Germany

e-Science and Preservation - A Perfect Match: The hard sciences have worked for many years towards global collaboration on an infrastructural and workflow level. The concepts and technologies developed in this venture are increasingly being adopted by other disciplines, including the social sciences and the arts and humanities. In Europe these disciplines aim to establish an integrated e-Science landscape to benefit from the existing resources and experience attained by the "hard sciences". Current humanities projects under the e-Science umbrella are on a promising route. The preservation community could benefit greatly from the e-Science community in striving for - amongst other things - sharing storage resources while maintaining local autonomy; interoperability and resource integration on a semantic level; adaptability for long-term stability; and sharing services and synchronising workflows. As part of its mission as a national coalition for digital preservation, nestor II aims to network the e-Science and preservation communities and further the transfer of concepts and tools between them.

Erik OLTMANS - Head Acquisitions and Cataloguing, National Library of the Netherlands

Electronic journals dominate the field of academic literature, and it is of great importance to the international scientific community that this electronic intellectual output remains accessible in perpetuity. It is uncertain, however, whether the traditional model, based on national deposits and geographical frontiers, will be able to guarantee the long-term safety of the international academic output in digital form. Academic literature is produced by multinational publishers and often no longer has an easily identified country of origin, and thus no obvious guardian. Hence, in the traditional model there is a huge risk of academic records being lost forever. A systematic and more concentrated approach is needed to address this unacceptable risk. In this presentation we discuss the policy and ambitions of the National Library of the Netherlands (Koninklijke Bibliotheek, KB) regarding digital archiving of electronic publications. We discuss three possible threats to permanent access, and we propose a coordinated and systematic approach to address these risks: the Safe Places Network. This paper also includes a comprehensive overview of the e-Depot system and the KB's approaches to digital preservation.

Sandy PAYETTE -- Digital Library Research and Development, Information Science program at Cornell University

Fedora: Complex Objects, Information Networks, and the Challenges of Digital Preservation -- We are at a point where the scope, definition, and uses of digital libraries and institutional repositories are changing along with the process of creating and disseminating scholarly and scientific information. It is not enough to just store documents, datasets, images, and other resources in repositories where they can be searched and accessed. The networked landscape in which we live inspires more, and calls upon us to develop information architectures that promote (1) the creation of complex digital objects that consist of both local and remote content, (2) the ability to interconnect digital objects that reside in distributed repositories, and (3) the ability to re-use objects or their components in the creation of new resources. All of these requirements have implications for how we build digital libraries, repositories, and scholarly information systems in the future, and they also present significant challenges for digital preservation. In this talk I will discuss how the Fedora Project is approaching the digital preservation problem. In particular, I will highlight key aspects of the Fedora repository system and service framework that are preservation-enabling. I will also review current work undertaken by the Fedora Project in collaboration with the Fedora Preservation Working Group. This work entails the use of message-oriented middleware to facilitate alerting of preservation-noteworthy events and the development of services to respond to such events.

Stephen RANKIN - CASPAR Project

Virtualisation of Simple Scientific Data Objects -- Capturing OAIS-defined representation information in a standardised way is critical for the preservation and future reuse of scientific data. The structure of a scientific data object needs to be defined so that a future user can map the data bits to the actual scientific data. The semantics associated with the scientific data also need to be defined so that the data can be understood and used by a user from the appropriate designated community. This presentation will show how simple scientific digital objects (tables, images etc.) can be described and "virtualised" by using representation information in the form of EAST file format descriptions (structure) and the corresponding DEDSL data dictionaries (semantics). It will also be shown that in some real-life cases the EAST and DEDSL standards need to be extended so that they can fully describe the simple objects. Tools and APIs will be demonstrated that take the structure and semantic definitions for a simple scientific digital object and automatically read the data it contains and render it in the appropriate way.
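The idea of driving a generic reader from separate structure and semantics descriptions can be sketched as follows. The field layouts and dictionary below merely stand in for EAST and DEDSL descriptions and are invented for illustration:

```python
import struct

# Hypothetical sketch of "virtualising" a simple tabular science object:
# STRUCTURE plays the role of an EAST description (field names and binary
# layouts), SEMANTICS the role of a DEDSL data dictionary (meanings and
# units). Generic code can then map raw bits back to meaningful values.
STRUCTURE = [("time", ">I"), ("flux", ">f")]  # big-endian uint32, float32
SEMANTICS = {"time": "seconds since observation start", "flux": "W/m^2"}

def read_record(buf):
    """Decode one fixed-layout record according to STRUCTURE."""
    out, offset = {}, 0
    for name, fmt in STRUCTURE:
        (value,) = struct.unpack_from(fmt, buf, offset)
        out[name] = value
        offset += struct.calcsize(fmt)
    return out

# Round-trip a sample record: pack raw bytes, then "virtualise" them.
record = read_record(struct.pack(">If", 1000, 1.5))
```

Because the reader consults only the descriptions, adding a new table layout means writing a new description, not new code; extending the descriptions (as the abstract notes is sometimes necessary) extends what the reader can handle.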

Victoria REICH - Director LOCKSS Program, Stanford University Libraries

People with responsibility for scholarly assets agree that digital preservation is important. Tomorrow's readers will need today's materials; without preservation they won't exist. Librarians and publishers are asking two fundamental questions: From this moment on, who will have custody of society's electronic information? From this moment on, who will control and govern society's electronic archival assets? With LOCKSS, over 150 libraries and over 80 publishers are working together to ensure that no one organization has control over our intellectual heritage. By ensuring that libraries can build collections and retain their role as long-term memory organizations in the electronic environment, LOCKSS avoids the social hazards of centrally controlled information. With CLOCKSS, 12 large publishers and seven libraries are working towards similar goals. For over eight years the OAIS-compliant, format-agnostic, open source LOCKSS system has demonstrated that it can ingest a wide range of web content, audit and repair it to ward off damage and attacks, and deliver it to readers with transparent on-access format migration. [www.lockss.org]

Seamus ROSS, Director HATII University of Glasgow and Associate Director of the UK's Digital Curation Centre and Andrew McHUGH, Audit and Advisory Manager of the Digital Curation Centre

Preservation Pressure Points: Evaluating Diverse Evidence for Risk Management -- Establishing a comprehensive understanding of the effectiveness and trustworthiness of a digital repository requires a broad range of evidence. Preservation can be considered as a complex spatial and chronological network of challenges and associated risks. For example, the organisational, financial, technological and operational contexts within which a repository resides, and the extent to which it is capable of managing them, must be ascertained if an audit is to assert the likelihood of the institution's success. Significant effort must be directed towards the definition of methodologies for identifying appropriate classes of evidence, towards their evaluation, and towards attaching weight to them. Formal means are required to facilitate the analysis and comparison of disparate evidence types, enabling auditors to accommodate a diverse range of physical, testimonial and experiment-based proof. In addition to binary systems of inquiry (e.g., does the organisation have a mission statement?), auditors must be able to distinguish the most persuasive examples from those that provide less substantive evidence of organisational competence. Similarly, if, for instance, a significant proportion of staff reveal that they have no idea of the content of their organisation's mission statement, then this must be reflected in the overall organisational assessment. A comprehensive insight, and consequent decision, can only be reached after fully exploring the evidential basis upon which compliance is to be founded. This discussion of evidential appraisal techniques for repository audit reflects the series of pilot audits undertaken by the Digital Curation Centre within a selection of UK data centres and archives, including the Beazley Archive and the British Atmospheric Data Centre.

Barbara SIGNORI - Swiss National Library

Web Archive Switzerland is a pilot project undertaken in collaboration with the Swiss Cantonal libraries as part of the e-Helvetica Project at the Swiss National Library (SNL). The mission of the SNL and the Swiss Cantonal libraries is to collect and archive all Swiss publications, both printed and electronic. The objective of the e-Helvetica Project is to fulfill this mission for electronic publications. The Web Archive Switzerland pilot project recently designed and tested a shared workflow for selecting, collecting, cataloguing, archiving and disseminating non-commercial Swiss web resources. Within the workflow the Cantonal libraries are responsible for the identification, selection and announcement of data on the Web. The SNL is responsible for the web-harvesting (objects and rights), cataloguing, archiving and dissemination of those identified web resources. The aim of this co-operation is to share resources and knowledge. This presentation will give some historical context to Web Archive Switzerland and discuss the collaboration with the Cantonal libraries and the shared workflow. The approach chosen for selecting, cataloguing and harvesting the web resources will be highlighted, as will our experiences to date. The presentation will conclude with some ideas on how we are planning to disseminate the web resources. For further information on the e-Helvetica Project (in German and French), see: <http://www.e-helvetica.ch>
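The division of responsibilities in the shared workflow can be sketched as a simple ordered pipeline. The step names below are illustrative assumptions, not identifiers from the e-Helvetica Project itself.

```python
# Illustrative sketch of the Web Archive Switzerland shared workflow:
# Cantonal libraries handle identification, selection and announcement;
# the SNL handles harvesting, cataloguing, archiving and dissemination.
WORKFLOW = [
    ("identify",    "Cantonal library"),
    ("select",      "Cantonal library"),
    ("announce",    "Cantonal library"),
    ("harvest",     "SNL"),  # objects and rights
    ("catalogue",   "SNL"),
    ("archive",     "SNL"),
    ("disseminate", "SNL"),
]


def responsibilities(actor):
    """List, in workflow order, the steps a given partner owns."""
    return [step for step, who in WORKFLOW if who == actor]


print(responsibilities("SNL"))
```

Modelling the workflow as an ordered list of (step, responsible-party) pairs makes the hand-off point between the partners explicit: everything after "announce" belongs to the SNL.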

Tim TAMMINGA - Endeavor Information Systems

For almost two years, the National Library of New Zealand (NLNZ), Endeavor Information Systems and Sun Microsystems have been working together to design and implement a comprehensive digital preservation platform for New Zealand's National Digital Heritage Archive. NLNZ has developed the requirements for the platform and, with Endeavor, is completing the specifications. From those, Endeavor is building a components-based application platform (called Kronos) and Sun is developing the overall reference architecture. A goal for all three organizations is to create a solution that is generic and applicable to any library with large and varied types of digital content. An independent peer review group has met periodically with the three organizations to ensure that the Kronos design is widely applicable to other libraries and complies with international standards. Our presentation describes why the partnerships formed and how they have evolved. The concept of the Peer Review Group will be examined as a method of ensuring industry-wide compliance and as a mechanism for explicating and resolving tension between partners.

Dr. Kenneth THIBODEAU - US National Archives and Records Administration (NARA)

After a year-long design competition, the National Archives and Records Administration in September 2005 awarded Lockheed Martin Corporation a contract valued at $308 million to develop and operate the Electronic Records Archives (ERA) system. The system will enable a comprehensive transition of NARA’s government-wide management of the records of the U.S. Government into the realm of e-government. It will automate records management processes, such as scheduling and appraisal, transfer of records to NARA, and destruction of temporary records. These functions will be applicable to records of all types, and will support NARA’s different lines of business in the National Archives, Presidential Libraries, and Federal Records Centers. The system will also enable NARA to realize its vision of preserving and providing continuing access to any type of electronic record, free from dependence on the hardware and software originally used to create and store it, for as long as the records need to be retained, which in some cases means for the life of the republic. The speaker will describe the architecture and processes for digital preservation and access in the ERA system.
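The scheduling and disposition automation described above can be sketched in miniature. The schedule format and rule below are assumptions for illustration only, not the actual ERA data model: a temporary record type carries a retention period in years, while a permanent record type is simply absent from the schedule.

```python
# Minimal sketch of automated records disposition (hypothetical model,
# not ERA's): temporary records become eligible for destruction once
# their retention period elapses; unscheduled types are kept permanently.
from datetime import date


def disposition(record_type, created, today, schedule):
    """Return the disposition action for a record under a retention schedule."""
    retention_years = schedule.get(record_type)
    if retention_years is None:  # not in the schedule: permanent record
        return "retain permanently"
    cutoff = date(created.year + retention_years, created.month, created.day)
    return "eligible for destruction" if today >= cutoff else "retain"


# Hypothetical schedule: routine correspondence is temporary (3 years).
SCHEDULE = {"routine correspondence": 3}

print(disposition("routine correspondence", date(2001, 5, 1), date(2006, 1, 1), SCHEDULE))
print(disposition("treaty", date(2001, 5, 1), date(2006, 1, 1), SCHEDULE))
```

Encoding disposition as a pure function of record type, creation date and schedule is what makes the process automatable across records of all types, as the abstract describes.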

David THOMAS - The National Archives, London, England

The UK scene -- David Thomas will focus on the Seamless Flow programme being undertaken at the National Archives of the UK. This is an ambitious plan to develop a fully comprehensive digital preservation service for UK central government. It covers the selection of records for permanent preservation, their transfer to the National Archives, and their storage, preservation and delivery to users. Crucially, the programme is also working in two other spaces – the intermediate storage of digital material that has long-term business value to government (such as files relating to radioactive waste) and resource discovery. Many of the hard lessons learned on this challenging programme have wider application, and he will be sharing these with you.

Rene VAN HORIK - Theme Manager, Data Archiving and Networked Services (DANS)

DANS is the new Dutch organization tasked with the preservation of, and permanent access to, research data in the humanities and the social sciences. The two main scientific administrative organizations in the Netherlands, the Royal Netherlands Academy of Arts and Sciences (KNAW) and the Netherlands Organisation for Scientific Research (NWO), are jointly responsible for DANS. DANS aims to promote (interdisciplinary) collaboration with and among researchers. DANS is also the Dutch national partner in European and other international data organizations, and acts as an intermediary to researchers for the delivery of large databases from organizations such as Statistics Netherlands, the Social and Cultural Planning agency and the Topographical Service. Already existing data archives, such as the social science data archive 'Steinmetzarchief' and the Netherlands Historical Data Archive (NHDA), have been merged into DANS. DANS has been mandated to expand and improve the current data research infrastructure. New initiatives for data services and data archives are being developed in fields where such facilities have been lacking until recently. These TOPical Programmes (TOPs) are set up in collaboration with research groups. Work on the renewal of the infrastructure for digital data archiving is in progress. Innovation and staying ahead in ICT development are among the main priorities of this new organization. To be able to apply fundamental research findings to systems used for archiving and providing access to data, DANS has established its own development team. For the acquisition of data sets, collaboration with faculties, research schools and academic institutes will be invigorated. In collaboration with researchers in the field and the Netherlands Ministry of Housing, Spatial Planning and the Environment (VROM), DANS has developed a user-friendly repository for research archiving.
Researchers are now able to place their research data in DANS-compliant data repositories. They can rely on safe preservation and curation of their data, and they can share their data with other researchers. DANS aims to make access to research data as open as possible.

Xiaolin ZHANG – National Science and Technology Library, Beijing, China

Dr. Zhang will present the China Archives of Digital Scientific Literature (CADSL) Initiative, proposed to the Ministry of Science and Technology of China as the result of a feasibility study by the National Science and Technology Library of China. Based on an analysis of needs and crises, a distributed network of trusted archives with sustainable planning and administration mechanisms is proposed to preserve digital scientific information resources, including foreign published literature, home-grown literature, and scientific web resources. Outlined in the proposed initiative are (1) a distributed and shared system of responsibility for archiving the targeted resources, (2) a framework of guidelines for intellectual property protection of preserved materials, (3) an open and continuous certification process for trusted repositories, (4) a rigorous outline of technological, workflow-management, and service-level-agreement requirements for prospective archives, and (5) a public yet competitive funding mechanism to draw on multiple resources for the initiative. The presentation concludes with ways to mobilize research and educational institutions to join the initiative, and with future efforts to promote collaboration with scientific data preservation and cultural heritage preservation.