Tuesday, June 28, 2016

Protecting the Long-Term Viability of Digital Composite Objects through Format Migration

Protecting the Long-Term Viability of Digital Composite Objects through Format Migration. Elizabeth Roke, Dorothy Waugh. iPRES 2015 poster. November 2015.
     The poster discusses work done at Emory University’s Manuscript, Archives, and Rare Book Library to "review policy on disk image file formats used to capture and store digital content in our Fedora repository". The goal was to migrate existing disk images to formats more suitable for long-term digital preservation. Trusted Repositories Audit & Certification (TRAC) requires that digital repositories monitor changes in technology so they can respond to them. The Advanced Forensic Format (AFF) initially offered a good solution for capturing forensic disk images along with disk image metadata, but it has since been replaced by libewf, Joachim Metz's library of tools for accessing the Expert Witness Compression Format (EWF). They have decided to acquire raw disk images or, when that is not possible, tar files, because raw disk images may be less vulnerable to obsolescence.

In attempting to migrate formats, they had to develop methods for migrating the files and set up the repository to accept the new files. They also rely on PREMIS metadata. The migration of disk images from a proprietary or unsupported format to a raw file format has made it easier for them to manage and preserve these objects and mitigates the threat of obsolescence for the near term. There have been some trade-offs: some metadata is no longer available, the process will be more complicated and require new workflows, and files will no longer contain embedded metadata. "The migration to a raw file format has made the digital file itself easier to preserve."
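
A minimal sketch of the fixity-checked migration step this implies, in Python; the file names, the recorded checksum, and the PREMIS-style fields are illustrative assumptions, not Emory's actual workflow:

    # Sketch: verify a migrated raw disk image against the checksum recorded
    # before migration, and log a PREMIS-style "migration" event.
    import hashlib
    import json
    from datetime import datetime, timezone

    def sha256(path, bufsize=1 << 20):
        """Stream the file through SHA-256 so large disk images fit in memory."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(bufsize), b""):
                h.update(chunk)
        return h.hexdigest()

    raw_image = "disk001.img"                       # hypothetical migrated raw image
    expected = "<checksum recorded at acquisition>" # hypothetical stored value
    actual = sha256(raw_image)

    event = {  # PREMIS-inspired event record
        "eventType": "migration",
        "eventDateTime": datetime.now(timezone.utc).isoformat(),
        "eventOutcome": "success" if actual == expected else "failure",
        "eventOutcomeDetail": actual,
        "linkingObjectIdentifier": raw_image,
    }
    print(json.dumps(event, indent=2))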



Monday, June 27, 2016

A Digital Dark Now?: Digital Information Loss at Three Archives in Sweden

A Digital Dark Now?: Digital Information Loss at Three Archives in Sweden. Anna-Maria Underhill and Arrick Underhill. Master’s thesis. Lund University. 2016. [PDF]
     The purpose of this study is to examine the loss of digital information at three Swedish archives. Digital preservation is a complex issue that most archival institutions struggle with, and focusing on successes to the exclusion of failures risks creating a blind spot for existing problems. The definition of digital information in this study includes digital objects and their metadata, as well as digital internal work documents that serve as contextual support for an archive’s collections. Results are analyzed in terms of the transition between the Records Lifecycle Model and the Records Continuum Model, an ontological understanding of digital information, the SPOT model for risk assessment, and the OAIS Reference Model.

Some of the conclusions re-affirm previous research, such as the need to prioritize organizational issues. Others look at the current state of digital preservation at these archives, which includes the delicate balancing act "between setting up systems for successful future digital preservation while managing existing digital collections which may not have been preserved correctly". Some institutions are unable to undertake a more proactive form of digital preservation because of the nature of the materials they preserve. The study points out that "when discussing digital preservation, the tendency remains to think of digitized material first rather than born digital information". The loss of a file may be only part of the loss; there is also the loss of metadata and of the connections between information, which may be more common than the loss of entire digital objects. "Finally, one question has followed this study from the beginning to the end: How can you know that you have lost something you never knew existed?"
  • When discussing digital preservation, it is important to clarify that storage is not the same thing as preservation.
  • The survival of information depends on maintaining its infrastructure and migrating it to contemporary formats.
  • Authenticity can be a major issue for digital records and is important to their evidentiality.
  • Emulation is another option for digital preservation; it targets the operating environment of the information rather than the file.
  • Emulation will eventually require migration, and it can become too complicated to be viable in the long run.
  • Sometimes digital preservation fails to preserve what it intends to save, which can be termed information loss.
  • Obsolescence is currently one of the greatest threats to successful digital preservation. If a file cannot be read, it is nearly the same as if the document had been destroyed.
  • "Without the provenance and the contextual links between records, records cannot be demonstrated to be authentic and reliable, evidentiality is lost and the use of the records for knowledge and understanding about what has happened will be difficult."

One definition of short, medium, and long-term preservation is:
  • Short-term preservation – solutions that are used for a short time, 5 years maximum.
  • Medium-term preservation – solutions that are used during a system’s lifetime, 10 years maximum.
  • Long-term preservation – solutions that are used after the originating system’s lifetime, the number of years varies, usually from 10 to 50 years.
"Dark archives are often used in order to separate the original master copies of a file from the copies that users actually access. These dark archives are generally only accessed when new material is being placed in them, and are otherwise protected in order to maintain the authenticity of the originals by placing them in an environment that is as tamper and error proof as possible"

Six essential properties for digital preservation which must be preserved:
  • Availability
  • Identity
  • Persistence
  • Renderability
  • Understandability
  • Authenticity
The study showed types of actual and potential information loss:
  • Loss of parts or whole digital objects during migration
  • Loss of the connections between analog and digital information belonging to the same archive
  • Loss of information due to it having been saved in an incorrect format
  • Loss of data in connection with technological changes
  • Loss of digital information when stored together with analog materials
  • Loss of information due to obsolete hardware
  • Loss of metadata due to databases written in code that is not open source
The reasons behind such actual and potential information loss were:
  • Human error during the production of information
  • An analog understanding and treatment of digital information
  • A lack of organizational structure and strategies for digital preservation
  • Lack of resources
  • Technological limitations
  • Lack of competencies among staff who produce digital information

Friday, June 24, 2016

File-format analysis tools for archivists

File-format analysis tools for archivists. Gary McGath. LWN. May 26, 2016.
     Preserving files for the long term is more difficult than just copying them to a drive; other issues are involved. "Will the software of the future be able to read the files of today without losing information? If it can, will people be able to tell what those files contain and where they came from?"

Digital data is more problematic than analog materials because file formats change. Specialized tools can check the quality of digital documents, analyze the files, and report problems. Some concerns:

  • Exact format identification: Knowing the MIME type isn't enough.
  • Format durability: Software can fade into obsolescence if there isn't enough interest to keep it updated.
  • Strict validation: Archiving accepts files in order to give them to an audience that doesn't even exist yet. This means it should be conservative in what it accepts.
  • Metadata extraction: A file with a lot of identifying metadata, such as XMP or Exif, is a better candidate for an archive than one with very little. An archive adds a lot of value if it makes rich, searchable metadata available (a sketch follows the tool list below).
Some open-source applications address these concerns, such as:
  • JHOVE (JSTOR-Harvard Object Validation Environment)
  • DROID and PRONOM
  • ExifTool
  • FITS File Information Tool Set
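
Metadata extraction with tools like these is easy to script. A minimal sketch that shells out to ExifTool (assuming the exiftool binary is installed and on the PATH; its -json flag emits machine-readable output):

    # Sketch: pull embedded metadata with ExifTool and summarize it.
    # Available fields vary widely by file format.
    import json
    import subprocess
    import sys

    def extract_metadata(path):
        result = subprocess.run(
            ["exiftool", "-json", path],
            capture_output=True, text=True, check=True,
        )
        return json.loads(result.stdout)[0]  # one dict per input file

    if __name__ == "__main__":
        meta = extract_metadata(sys.argv[1])
        print(f"{len(meta)} metadata fields found in {sys.argv[1]}")
        for key in ("FileType", "MIMEType", "CreateDate"):
            print(f"  {key}: {meta.get(key, '(absent)')}")
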
"Identifying formats and characterizing files is a tricky business. Specifications are sometimes ambiguous."  There are different views on how much error, if any, is acceptable. "Being too fussy can ban perfectly usable files from archives."

"Specialists are passionate about the answers, and there often isn't one clearly correct answer. It's not surprising that different tools with different philosophies compete, and that the best approach can be to combine and compare their outputs"


Wednesday, June 22, 2016

Five Star File Format Signature Development

Five Star File Format Signature Development. Ross Spencer. Open Preservation Foundation blog. 14 Jun 2016.
     A discussion of formats and the importance of developing identification techniques for text formats. DROID is a useful tool, but it has its limitations. For those wanting to be involved in defining formats, there are five principles of file format signature development:
  1. Tell the community about your identification gaps
  2. Share sample files
  3. Develop basic signatures (illustrated below)
  4. Where feasible, engage openly with the community
  5. Seek supporting evidence
Developing file format signatures is really reverse engineering.
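
At its simplest, a "basic signature" is a distinctive byte sequence at a known offset. A toy sketch in Python using a few well-known magic numbers (real PRONOM/DROID signatures add offsets, wildcards, and priorities):

    # Sketch: minimal byte-signature matching at offset 0.
    SIGNATURES = {
        b"%PDF-": "PDF document",
        b"\x89PNG\r\n\x1a\n": "PNG image",
        b"GIF87a": "GIF image (87a)",
        b"GIF89a": "GIF image (89a)",
        b"PK\x03\x04": "ZIP container (also DOCX, EPUB, ...)",
    }

    def identify(path):
        with open(path, "rb") as f:
            head = f.read(16)
        for magic, name in SIGNATURES.items():
            if head.startswith(magic):
                return name
        return "unidentified"

The ZIP entry shows why signatures alone are rarely the last word: many formats are ZIP containers underneath, so deeper inspection is needed to tell them apart.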

Tuesday, June 21, 2016

Vienna Principles: A Vision for Scholarly Communication

Vienna Principles: A Vision for Scholarly Communication. Peter Kraker, et al. June 2016.
     The twelve principles of Scholarly Communication are:
  1. Accessibility: should be immediately and openly accessible by anyone.
  2. Discoverability: should facilitate search, exploration and discovery.
  3. Reusability: should enable others to effectively build on top of each other’s work.
  4. Reproducibility: should provide reproducible research results.
  5. Transparency: should provide open means for judging the credibility of a research result.
  6. Understandability: should provide research in an understandable way adjusted to different stakeholders.
  7. Collaboration: should foster collaboration and participation between researchers and their stakeholders.
  8. Quality Assurance: should provide transparent and competent review.
  9. Evaluation: should support fair evaluation.
  10. Validated Progress: should promote both the production of new knowledge and the validation of existing knowledge.
  11. Innovation: should embrace the possibilities of new technology.
  12. Public Good: should expand the knowledge commons.

Monday, June 20, 2016

Preserving Transactional Data

Preserving Transactional Data. Sara Day Thomson. DPC Technology Watch Report 16-02. May 2016.
     This report examines the requirements for preserving transactional data and the challenges in re-using these data for analysis or research. "Transactional" will be used to refer to "data that result from single, logical interactions with a database and the ACID properties (Atomicity, Consistency, Isolation, Durability) that support reliable records of interactions."
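
To make the ACID idea concrete, a small sketch using SQLite from the Python standard library: either every statement in a transaction takes effect, or none do (atomicity):

    # Sketch: atomicity in practice. If any statement fails, the whole
    # transaction rolls back and the database is left unchanged.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, item TEXT)")

    try:
        with conn:  # transaction: commits on success, rolls back on error
            conn.execute("INSERT INTO orders (item) VALUES ('widget')")
            raise RuntimeError("simulated failure mid-transaction")
    except RuntimeError:
        pass

    count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
    print(count)  # 0 -- the partial insert was rolled back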

Transactional data, created through interactions with a database, can come from many sources and different types of information. "Preserving transactional data, whether large or not, is imperative for the future usability of big data, which is often comprised of many sources of transactional data." Such data have potential for future developments in consumer analytics and in academic research, and "will only lead to new discoveries and insights if they are effectively curated and preserved to ensure appropriate reproducibility."

The organizations that collect transactional data aim to manage and preserve it for business purposes as part of their records management. There are strategies for database preservation, as well as tools and standards that support data re-use. Strategies for managing and preserving big transactional data must adapt to both SQL and NoSQL environments. Significant challenges include the large amounts of data, rapidly changing data, and the many different sources of data creation.

Some notes:
  • understanding the context and how the data were created may be critical in preserving the meaning behind the data
  • data purpose: preservation planning is critical in order to make preservation actions fit for purpose while keeping preservation cost and complexity to a minimum
  • how data are collected or created can have an impact on long-term preservation, particularly when database systems have multiple entry points, leading to inconsistency and variable data quality.
  • Current technical approaches to preserving transactional data primarily focus on the preservation of databases. 
  • Database preservation may not capture the complexities and rapid changes enabled by new technologies and processing methods 
  • As with all preservation planning, the relevance of a specific approach depends on the organization’s objectives.
There are several approaches to preserving databases:
  • Encapsulation
  • Emulation 
  • Migration/Normalization
  • Archival Data Description Markup Language (ADDML)
  • Standard Data Format for Preservation (SDFP) 
  • Software Independent Archiving of Relational Databases (SIARD)
"Practitioners of database preservation typically prefer simple text formats based on open standards. These include flat files, such as Comma Separated Value (CSV), annotated textual documents, such as Extended Markup Language (XML), and the international and open Structured Query Language (SQL)." The end-goal is to keep data in a transparent and vendor-neutral database so they can be  reintegrated into a future database.

Best practices:
  1. choose the best possible format, either preserving the database in its original format or migrating to an alternative format.
  2. after a database is converted, encapsulate it by adding descriptive, technical, and other relevant documentation to understand the preserved data.
  3. submit the database to a preservation environment that will curate it over time.
Research is continuing in the collection, curation, and analysis of data; digital preservation standards and best practices will make the difference between just data and "curated collections of rich information".

Friday, June 17, 2016

The Web’s Past is Not Evenly Distributed

The Web’s Past is Not Evenly Distributed. Ed Summers. Maryland Institute for Technology in the Humanities. May 27, 2016.
     This post discusses ways to structure content "with the grain of the Web so that it can last (a bit) longer." The Web was designed without a central authority to make sure all the links work, and no permission is needed to link to a site. One result is a web where about 5% of links break per year, according to one estimate.

"The Web dwells in a never-ending present. It is—elementally—ethereal, ephemeral, unstable, and unreliable. Sometimes when you try to visit a Web page what you see is an error message: Page Not Found. This is known as link rot, and it’s a drag, but it’s better than the alternative. Jill Lepore." If we didn’t have a partially broken Web, where content constantly change and links break, it’s quite possible we wouldn’t have a Web at all.  Some things to take note of:
  • problems with naming things
  • redirects
  • proxies
  • web archives
  • static sites
  • data export
"Being able to export your content from one site to the next is extremely important for the long term access to your data. In many ways the central challenge we face in the preservation of Web content, and digital content generally, is the recognition that each system functions as a relay in a chain of systems that make the content accessible."

"Our knowledge of the past has always been mediated by the collective care of those who care to preserve it, and the Web is no different."


Thursday, June 16, 2016

Current Game Preservation is Not Enough

Current Game Preservation is Not Enough. Eric Kaltman. Eric Kaltman's blog. 6 June, 2016.
     The current preservation practices we use for games and software must be reconsidered for modern computer games. The standard preservation model considers three major areas of interest:
  1. the physical extent of the game, 
  2. the data stored on it, and 
  3. the hardware necessary to run it. 
The long-term physical maintenance of games is not particularly good, since the media and hardware degrade over time and the data will not be readable once the media fail. The model also does not reflect current game technology or the networked world. "Solutions? What are some ways to combat this looming preservation quagmire? (It’s also not looming, since it’s already here.)"
  1. Consider what we are trying to save when we preserve video games. Is it to save the ability to play a historical game at some point in the future, or to record the act of play itself?
  2. Get the people creating games to dedicate time to basic preservation activities, such as providing records of development, production processes, and legacy source code that would help to recreate or recover the games.
  3. There needs to be more pressure and motivation from society to legitimize games as cultural production worth saving, and to create institutional structures to fight for preservation activity, similar to what is being done for film.
  4. All of this applies to more than just games: software in general may be in an even worse situation.
The post refers to two YouTube presentations that the author gave on game preservation.

Wednesday, June 15, 2016

Keep Calm and do Practical Records Preservation

Keep Calm and do Practical Records Preservation. Matthew Addis. Conference on European Electronic Data Management and eHealth Topics. 23 May 2016.
     The presentation looks at some of the practical tools and approaches that can be used to ensure that digital content remains safe, secure, accessible, and readable over multiple decades. It covers mostly "practical and simple steps towards doing digital preservation for electronic content" but also some ways to determine how well prepared you are for preservation. Some things you need to show:
  • ongoing integrity and authenticity of content in an auditable way (a sketch follows at the end of this entry).
  • that content is secured and access is controlled.
  • the ability to access content when needed that is readable and understandable.
  • the ability to do this over decades, which is a very long time in the IT world.
  • an archivist with clear responsibility for making all this happen.
  • appropriate processes that manage all the risks proportionally.
A really simple definition of Digital Preservation from the Library of Congress: "the management of content in a pro-active way so that it remains accessible and usable over time." 

"Focus on the basic steps that need to be done now in order to support something bigger and better in the future." Know what you have and get the precious stuff in a safe storage environment.


Tuesday, June 14, 2016

Digital Preservation: We have to get this right

"We have to get this right." Jennifer Paustenbaugh. Digital Preservation. Harold B. Lee Library, Brigham Young University. June, 2016.
     Here are some recent email comments from Jennifer Paustenbaugh, our University Librarian, on digital preservation:
  • “We have to get this right. If we don't, then not much else that we’re doing in research libraries matters. If we don’t fully develop a sustainable digital preservation program, we could negatively impact whole areas of research, because materials created right now could just disappear. I think about gaps that exist in records because of man-made events and natural disasters. This could be a disaster of our own making.” 
  • "I truly believe that of all the things we’re doing in the library, this is the thing that has the potential to make the biggest difference to scholars 20 or 50 years from now. Much of the digital content that we are preserving will be gone forever if we don’t do this right. It’s a role that at once is formidable and humbling. And for most people, it will probably never be important until something that is vital to their research is just missing (and forever unavailable) from the historical record."

Monday, June 13, 2016

Macro & Micro Digital Preservation Services & Tools

Rosetta Users Group 2016: Macro & Micro Digital Preservation Services & Tools. Chris Erickson. June 7, 2016. [PDF slides]
      This is my presentation at the Rosetta Users Group / Advisory Group meetings held this past week in New York (I always enjoy these meetings; they are probably my favorite conference).
  • Preservation Micro Services: free-standing applications that perform a single or limited number of tasks in the larger preservation process. The presentation includes some of those we use in our processes, both from internet sites and those that we have created in-house. Micro services are often used in the following processes: 
    • Capture
    • Appraisal
    • Processing
    • Description
    • Preservation
    • Access
  • Preservation Macro Services: Institutional services and directions that assist organizations in identifying and implementing a combination of policies, strategies, and tactics to effectively meet their preservation needs. Some of these are:
    • Digital Preservation Policy Framework
    • Workflows
    • Storage plans
    • Financial Commitment and
    • Engaging the Community
"Practitioners can only make use of individual micro-services tools if they understand which roles they play in the larger digital curation and preservation process...." Richard Fyffe 

“We have to get this right. If we don't, then not much else that we’re doing in research libraries matters. If we don’t fully develop a sustainable digital preservation program, we could negatively impact whole areas of research, because materials created right now could just disappear. I think about gaps that exist in records because of man-made events and natural disasters. This could be a disaster of our own making.” Jennifer Paustenbaugh. University Librarian. 

Since I started in this position in 2002, our digital preservation challenges have changed and increased. Re-evaluating where we are heading and how we proceed is important. A combination of broad visions and practical applications can ensure the future use of digital assets.

Thursday, June 02, 2016

The Three C’s of Digital Preservation: Contact, Context, Collaboration

The Three C’s of Digital Preservation: Contact, Context, Collaboration. Brittany. DigHist Blog. May 5, 2016.
     The post looks at three themes from learning about digital preservation: "every contact leaves a trace, context is crucial, and collaboration is the key".

Contact: A digital object is more than what we see; we need to take into consideration the hardware, software, code, and everything that runs underneath it. There are "layers and layers of platforms on top of platforms for any given digital object": the software, the browser, the operating system, and others. These layers or platforms are constantly obsolescing or changing and "cannot be relied upon to preserve the digital objects. Especially since most platforms are proprietary and able to disappear in an instant."

Context is Crucial: "There’s no use in saving everything about a digital object if we don’t have any context to go with it." Capture the human experience with the digital objects. 

Collaboration is the Key: "There are a number of roles played by different people in digital preservation, and these roles are conflating and overlapping." As funding becomes tighter and the digital world more complex, "collaboration is going to become essential for a lot of digital preservation projects".   

There are still many unanswered questions that need to be asked and answered.


Tuesday, May 31, 2016

Introduction to Free and/or Open Source Tools for Digital Preservation

Introduction to Free and/or Open Source Tools for Digital Preservation. Max Eckard. University of Michigan. May 16, 2016.
     The post refers to a workshop given as part of the Personal Digital Archiving 2016 conference, entitled "Introduction to Free and/or Open Source Tools for Digital Preservation". The workshop introduced participants to a mix of open-source and/or free software used to review personal digital archives and "perform preservation actions on content to ensure its long-term authenticity, integrity, accessibility, and security". The presentation slides and Google Doc file are available and contain all the links and additional information.

The table of contents:
      Still Images
      Text(ual) Content
      Audio and Video
      MD5summer


Monday, May 30, 2016

US nuclear force still uses floppy disks

US nuclear force still uses floppy disks. BBC News Services. 26 May 2016.
     A government report shows that the US nuclear weapons forces (intercontinental ballistic missiles, nuclear bombers, and tanker support aircraft) still use a 1970s-era computer system and 8-inch floppy disks. The GAO said these "legacy systems" need to be replaced; legacy systems cost about $61bn a year to maintain. "This system remains in use because, in short, it still works." "However, to address obsolescence concerns, the floppy drives are scheduled to be replaced with secure digital devices by the end of 2017." According to the report, US Treasury systems still use a system written in "assembly language code".


Saturday, May 28, 2016

List of analog media inspection templates/forms

List of analog media inspection templates/forms. Katherine Nagels, et al. May 6, 2016.
     This is a list of freely available analog media inspection templates, forms, or reports. Anyone is free to add contributions. Appropriate additions may include:
  • inspection reports/forms/templates
  • condition reports/forms/templates
  • instructional guides for inspecting or assessing the condition of analog film, audio, or video



Thursday, May 26, 2016

The Governance of Long-Term Digital Information

The Governance of Long-Term Digital Information. IGI 2016 Benchmark. Information Governance Initiative. May 18, 2016. [PDF]
     “The critical role of digital . . .archives in ensuring the future accessibility of information with enduring value has taken a back seat to enhancing access to current and actively used materials. As a consequence, digital preservation remains largely experimental and replete with the risks . . . representing a time bomb that threatens the long-term viability of [digital archives].”

1. We have a problem. Nearly every organization has digital information they want to keep for 10 or more years.
2. The problem is technological, most often a storage problem.
3. The problem is business related. It is not related to just archives, libraries or museums. 
4. The problem is a legal problem. Legal requirements are the main reason organizations keep digital information longer than ten years.
5. We know what we must do, but are we doing it? In a survey, 97% said they are aware that digital information is at risk of obsolescence, but three-fourths are just thinking about it or have no strategy. Only 16% have a standards-based digital preservation system.
  • “Most records today are born digital."
  • Digital assets should be considered business-critical information, and steps should be taken to keep them usable long into the future.
  • Most organizations are not storing their long-term digital assets in a manner sufficient to ensure their long-term protection and accessibility.
     
How are they being kept? According to a survey:
  • Shared Network Drive: 68%
  • Business Applications (e.g. CRM, ERP): 52%
  • Enterprise Content Management System: 47%
  • Disk or Tape Backup Systems: 44%
  • Records Management System: 43%
  • Application-specific Archiving (e.g. email): 33%
  • Removable Media (e.g. CD or USB): 22%
  • Enterprise Archiving System: 14%
  • Long-term Digital Preservation System: 11%
  • Other: 9%
  • Commodity Cloud Storage (e.g. Amazon): 8%
  • I don't know: 1%

Where to start? Some recommendations:
  • Triage right now the materials that are in serious danger of being lost, damaged, or rendered inaccessible.
  • Conduct a formal assessment so that you can benefit from strategic planning and economies of scale.
  • Address the past, protect the future.
  • Catalog the consequences of not being able to access and rely upon your own information.
  • Build your rules for protection and accessibility.
  • Assess the IT environment.

Thursday, May 19, 2016

One Billion Drive Hours and Counting: Q1 2016 Hard Drive Stats

One Billion Drive Hours and Counting: Q1 2016 Hard Drive Stats. Andy Klein. Backblaze. May 17, 2016.
     Backblaze reports statistics for the first quarter of 2016 on 61,590 operational hard drives used to store encrypted customer data in their data center. The hard drives in the data center, past and present, have totaled over one billion hours in operation to date. The data in these hard drive reports has been collected since April 10, 2013, and the website shows statistical reports of drive operations and failures for every year since then. The report shows the drives (and drive models) by manufacturer, the number in service, the time in service, and failure rates. The drives in the data center come from four manufacturers; most are from HGST and Seagate. Notes:
  • The overall annual failure rate of 1.84% is the lowest quarterly number Backblaze has ever seen (the arithmetic behind such rates is sketched below).
  • The Seagate 4TB drive leads in “hours in service”.
  • The early HGST drives, especially the 2TB and 3TB models, have lasted a long time and have provided excellent service over the past several years.
  • HGST has the most hours in service.
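
Backblaze annualizes failures against total drive-days of service; a sketch of that arithmetic, under my reading of their methodology and with made-up numbers:

    # Sketch: annualized failure rate (AFR) from failures and drive-days.
    # AFR = failures / (drive_days / 365) * 100
    def annual_failure_rate(failures, drive_days):
        drive_years = drive_days / 365.0
        return 100.0 * failures / drive_years

    # Made-up example: 60 failures across 1,200,000 drive-days
    print(f"{annual_failure_rate(60, 1_200_000):.2f}%")  # ~1.82%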


IBM Scientists Achieve Storage Memory Breakthrough

IBM Scientists Achieve Storage Memory Breakthrough. Press release. 17 May 2016.
     IBM Research demonstrated reliably storing 3 bits of data per cell using phase-change memory. This technology doesn't lose data when powered off and can endure at least 10 million write cycles, compared to 3,000 write cycles for an average flash USB stick. This provides "fast and easy storage" to capture the exponential growth of data.


Wednesday, May 18, 2016

Floppy Disk Format Identifier Tool

Floppy Disk Format Identifier Tool. Euan Cochrane. Digital Continuity Blog. May 13, 2016.
     Euan created this tool https://github.com/euanc/DiskFormatID (which he documents in this great blog post) to:
  1. “Automatically” identify floppy disk formats from KryoFlux stream files.
  2. Enable “simple” disk imaging workflows that don’t include a disk format identification step during the data capture process.
The tool processes copies of floppy disk data saved in the KryoFlux stream file format, creates a set of disk image files formatted according to assumptions about the disk’s format, and allows the user to try mounting the image files as file systems. It requires the KryoFlux software to function. The documentation also provides detailed information on how to use it, along with other interesting information.
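
For a flavor of what format guessing involves (this is not how DiskFormatID actually works; it operates on the much richer flux-level stream data), a toy heuristic that checks whether a raw image looks like a FAT-formatted floppy:

    # Sketch: crude FAT boot-sector heuristic for a raw floppy image.
    # Many historical disks violate these conventions, so treat a
    # negative result as "unknown", not "not FAT".
    import struct

    def looks_like_fat(path):
        with open(path, "rb") as f:
            boot = f.read(512)
        if len(boot) < 512:
            return False
        jump_ok = boot[0] in (0xEB, 0xE9)      # x86 jump instruction
        sig_ok = boot[510:512] == b"\x55\xAA"  # boot-sector signature
        bytes_per_sector = struct.unpack_from("<H", boot, 11)[0]
        return jump_ok and sig_ok and bytes_per_sector in (512, 1024)

    print(looks_like_fat("floppy.img"))  # hypothetical raw image file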

Thursday, May 12, 2016

The Center for Jewish History Adopts Rosetta for Digital Preservation and Asset Management

The Center for Jewish History Adopts Rosetta for Digital Preservation and Asset Management. Ex Libris. Press Release. May 12, 2016.
     After a thorough search process, the Center for Jewish History selected the Ex Libris Rosetta digital asset management and preservation solution. They wanted a system to handle their comprehensive list of requirements for both long‑term digital preservation and robust management of digital assets, including the ability to interface with their other systems.

The Center’s partners are American Jewish Historical Society, American Sephardi Federation, Leo Baeck Institute, Yeshiva University Museum, and YIVO Institute for Jewish Research.  The collections include more than five miles of archival documents, over 500,000 volumes, and thousands of artworks, textiles, ritual objects, recordings, films, and photographs.