Online Sharing of Digital Preservation Problems and Solutions. bram van der werf. Open Planets Foundation. 27 October 2011.
From the beginning, the vision of the Open Planets Foundation has been that meaningful discussions around actual digital preservation problems are key to finding solutions and tools, and ultimately to improving digital preservation practice. It encourages people to document issues and search for practical solutions to those problems.
The online digital preservation community needs to be driven by content issues, practical solutions and, above all, meaningful discussions between all stakeholders. Digital preservation may be a technology challenge, but it is definitely not technology for its own sake; it is about keeping digital content safe and accessible. The technology serves the digital content users and curators. In small to medium institutions, where people fill multiple roles, the digital preservation practitioner is often also the collection owner; in larger institutions, this becomes more of a distributed role among those with different technical responsibilities. By involving all of these groups, everyone can benefit from the larger organizations, which in turn can act on today's problems as experienced by practitioners at small and medium institutions.
This blog contains information related to digital preservation, long term access, digital archiving, digital curation, institutional repositories, and digital or electronic records management. These are my notes on what I have read or been working on. Please note: this does not reflect the views of my employer or anyone else.
Saturday, May 19, 2012
Metadata Clean Sweep: A Digital Library Audit Project
Metadata Clean Sweep: A Digital Library Audit Project. R. Niccole Westbrook, et al.
As digital library collections grow in size, metadata problems such as inconsistency, incompleteness and poor quality become increasingly difficult to manage over time. This was a project by library school interns and full-time staff to alleviate poor recall, poor precision and metadata inconsistencies across digital collections. Access to content is the primary goal of a digital library's metadata. Successful metadata maintenance must include both correcting the mistakes that are found and correcting the practices that lead to those mistakes.
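The article does not prescribe specific tooling, but audits like this are straightforward to script. Below is a minimal Python sketch of the kind of record-level check such a project might run; the required fields and the date rule are illustrative assumptions, not the project's actual metadata profile.

```python
import re

# Hypothetical required Dublin Core fields for this audit; adjust per collection.
REQUIRED_FIELDS = ["title", "creator", "date", "rights"]
ISO_DATE = re.compile(r"^\d{4}(-\d{2})?(-\d{2})?$")  # e.g. 2012, 2012-05, 2012-05-19

def audit_record(record):
    """Return a list of problems found in one metadata record (a dict)."""
    problems = []
    for field in REQUIRED_FIELDS:
        if not record.get(field, "").strip():
            problems.append("missing or empty: %s" % field)
    date = record.get("date", "")
    if date and not ISO_DATE.match(date):
        problems.append("non-ISO date: %r" % date)
    return problems

# Two invented sample records to show the audit in action.
records = [
    {"title": "Oral history interview", "creator": "", "date": "May 19, 2012", "rights": "Public domain"},
    {"title": "Campus photograph", "creator": "Unknown", "date": "1948", "rights": "Public domain"},
]

for i, rec in enumerate(records):
    for problem in audit_record(rec):
        print("record %d: %s" % (i, problem))
```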
A simple JP2 file structure checker.
A simple JP2 file structure checker. Johan van der Knijff. Open Planets. 1 September 2011.
One major risk of file migration is that hardware failures during the migration process may result in corrupted images. Ideally, format validation tools such as JHOVE would detect these errors. "I started out with removing some trailing bytes from a lossily compressed JP2 image. As it turned out, I could remove most of the image code stream (reducing the original 2 MB image to a mere 4 kilobytes!), but JHOVE would still say the file was "well-formed and valid". I was also able to open and render these files with viewer applications such as Adobe Photoshop, Kakadu's viewer and Irfanview. The behaviour of the viewer apps isn't really a surprise, since the ability to render an image without having to load the entire code stream is actually one of the features that make JPEG 2000 so interesting for many access applications. JHOVE's behaviour was a bit more surprising, and perhaps slightly worrying." So he wrote a very simple JP2 file structure checker, which is available for download.
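His checker is available from the post; as a rough illustration of the underlying idea, the from-scratch sketch below walks the top-level JP2 box structure (each box declares its own length) and flags a file whose final box overruns the available bytes, which is exactly what truncation produces. This is not his code, just the same technique in miniature.

```python
import struct

def check_jp2_structure(path):
    """Walk the top-level JP2 box structure; return True if boxes tile the file exactly.
    A truncated file will have a final box whose declared length overruns the data."""
    with open(path, "rb") as f:
        data = f.read()
    pos, end = 0, len(data)
    while pos < end:
        if end - pos < 8:
            return False               # not enough bytes left for a box header
        length, = struct.unpack(">I", data[pos:pos + 4])
        # box type (4 ASCII bytes) sits at data[pos+4:pos+8]; not validated here
        if length == 1:                # extended length: 8-byte XLBox follows the header
            if end - pos < 16:
                return False
            length, = struct.unpack(">Q", data[pos + 8:pos + 16])
        elif length == 0:              # box runs to end of file (only legal for the last box)
            length = end - pos
        if length < 8 or pos + length > end:
            return False               # declared length overruns the file: truncated/corrupt
        pos += length
    return pos == end

print(check_jp2_structure("example.jp2"))  # hypothetical test file
```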
Friday, May 18, 2012
The CLIF Project: The Repository as Part of a Content Lifecycle
The CLIF Project: The Repository as Part of a Content Lifecycle. Richard Green, Chris Awre, Simon Waddington. Ariadne. 9 March 2012.
This was a joint project that did an extensive literature review and worked with digital content creators to understand how to deal with the interaction of the authoring, collaboration and delivery of materials. At the heart of meeting institutional requirements for managing digital content is the need to understand the different operations through which content goes, from planning and creation through to disposal or preservation. Repositories must be integrated with the other systems that support other parts of this lifecycle to prevent them becoming yet another information silo within the institution.
The CLIF software has been designed to allow maximum flexibility in how and when users can transfer material from one system to another, integrating the tools in such a way that they seem to be natural extensions of the underlying systems. This open source software is available for others to investigate and use.
The repository’s archival capability is regarded as one of its strongest assets, and the role of the repository within a University will be regarded very much in terms of what it can offer that other campus systems cannot. It should not try to compete on all levels. There is a need to clarify better at an institutional level what functionality is offered by different content management systems, in order to better understand how different stages of the digital content lifecycle can be best enabled.
Pogoplug service turns your computers into private cloud
Pogoplug service turns your computers into private cloud. Lucas Mearian. Computerworld. May 9, 2012.
Pogoplug Team allows home offices and small businesses (best for 10 to 50 people) to turn file servers or spare PCs into pools of storage accessible from the Web. This essentially creates a private cloud for businesses, so they can store information within their own firewalls rather than on third-party services. The product allows multiple users to share storage in a home or office; its capabilities are similar to Dropbox for file sharing and collaboration. It offers full access and file sharing for 3TB of storage at $150 (not including the $15-a-year, per-person licensing fee paid to Pogoplug for its service).
Thursday, May 17, 2012
A prototype JP2 validator and properties extractor.
A prototype JP2 validator and properties extractor. Johan van der Knijff. The Open Planets Foundation. 14 December 2011, Update March 2012.
The site includes a discussion of the full-fledged JP2 validator tool, as well as how it validates the object. A prototype is now ready in the form of the jpylyzer tool. The tool is both a validator and a properties extractor.
Wednesday, May 16, 2012
Bit Rot & Long Term Access.
Bit Rot & Long Term Access. bram van der werf. Open Planets. 28 February 2011.
Discussion of bit rot in an image collection and the call to examine more closely preservation and long term access. There is a Problem, Requirement, Use Case sequence in the "what we need" section of the OPF wiki.
"We need to make serious efforts to drill down into every individual potential element and risk for Bit Rot. Create Use Cases and take action on these Use Cases and develop practical tools and/or workflow to assure we can prevent Bit Rot." These may be more a case of creating scripts that can compare "checksums" or "open and save as new" type of activities inside a repository. Computer hardware is "subject to a very short technical life cycle and by far not as reliable for keeping the integrity of the bits as we sometimes like to believe."
It is crucial to understand the challenge and for experts to open a continuous discussion.
BitCurator: Tools and Techniques for Digital Forensics in Collecting Institutions.
BitCurator: Tools and Techniques for Digital Forensics in Collecting Institutions. Christopher A. Lee, et al. D-Lib Magazine. May/June 2012.
The BitCurator Project aims to incorporate digital forensics tools and methods into the workflows of collecting institutions: libraries, archives and museums (LAMs). There is an increasing need to move born-digital materials from removable media into more sustainable preservation environments. The criteria for the multi-year project include the following:
- The forensic technologies should be open source, extensible, and mature.
- Development will focus on extensions, plugins, and wrappers for proven software rather than from-scratch development.
- It will adhere to a common digital forensic metadata standard and provide crosswalks to relevant library and archives metadata schemas.
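On the last point, a crosswalk is essentially a field mapping. The sketch below illustrates one possible shape for it, reading DFXML fileobject entries and emitting PREMIS-flavoured properties. The element names follow common DFXML usage (fileobject, filename, filesize, hashdigest) but should be treated as assumptions rather than the project's actual schema work.

```python
import xml.etree.ElementTree as ET

def local(tag):
    return tag.rsplit("}", 1)[-1]       # strip any XML namespace prefix

def crosswalk(dfxml_path):
    """Map DFXML fileobject entries to a flat, PREMIS-flavoured dict per file."""
    for _, elem in ET.iterparse(dfxml_path):
        if local(elem.tag) != "fileobject":
            continue
        record = {}
        for child in elem:
            name = local(child.tag)
            if name == "filename":
                record["originalName"] = child.text       # cf. PREMIS originalName
            elif name == "filesize":
                record["size"] = child.text               # cf. objectCharacteristics/size
            elif name == "hashdigest":
                algo = child.get("type", "unknown")
                record["fixity:" + algo] = child.text     # cf. messageDigest + algorithm
        yield record
        elem.clear()                    # keep memory flat on large disk images

for rec in crosswalk("disk_image.dfxml"):   # hypothetical DFXML file
    print(rec)
```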
Implementing DOIs for Research Data.
Implementing DOIs for Research Data. Natasha Simons. D-Lib Magazine. May/June 2012.
As research becomes more collaborative and global, it is also becoming more difficult to manage the large amounts of research data generated daily. The Digital Object Identifier (DOI) system is one way to create persistent identifiers for research data collections and datasets. "Data that is richly described, organised, integrated and connected allows the data to be more easily discovered by other researchers." Identifying such resources allows research data collections and datasets to be open and discoverable to others, but there are questions that need to be answered, such as what type of material should get a persistent identifier, at what granularity, whether the landing page or the resource itself should get the identifier, who creates and maintains the identifiers, and for how long. These questions, common to other institutions, should encourage discussion and collaboration.
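One of those questions, landing page versus resource, is easy to make concrete. The sketch below is purely illustrative (10.5072 is DataCite's reserved test prefix, and all names and URLs are invented): the DOI is registered against a landing page that describes the dataset, so the identifier keeps working even if the underlying files are reorganised.

```python
# Illustrative only: the registry contents below are hypothetical.
registry = {
    "10.5072/example.dataset.42": {               # 10.5072 = DataCite test prefix
        "landing_page": "https://data.example.edu/dataset/42",
        "resource":     "https://data.example.edu/dataset/42/files/readings.csv",
        "granularity":  "dataset",                # could also mint per-collection or per-file
    },
}

def resolve(doi, want_data=False):
    """Resolve a DOI to its landing page by default; registering the landing page,
    not the raw file, lets the link survive later file reorganisation."""
    entry = registry[doi]
    return entry["resource"] if want_data else entry["landing_page"]

print(resolve("10.5072/example.dataset.42"))
```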
Preserving Email DPC Technology Watch Report released.
Preserving Email DPC Technology Watch Report released. Neil Beagrie. Blog. News release. 17 Feb 2012.
The 57-page Preserving Email technology watch report gives practical advice on how to ensure that email remains accessible. "Users normally shoulder the ultimate responsibility for managing and preserving their own email. This exposes important records to needless risks and is counterproductive in many cases." The report outlines three basic steps institutions should undertake:
- Define policies, including institutional commitment, specific actions to take, and end-user expectations, responsibilities and rights regarding the email archives.
- Choose appropriate tools to work with email, in an environment where users have adequate storage space and no auto-deletion settings.
- Implement them in light of local environmental factors and available resources.
The report describes five approaches to capturing and preserving email:
- The ‘sweeping up crumbs’ or whole-account approach: harvesting email found on a user’s computer or account.
- The ‘nurturing and harvesting’ approach: helping email users ensure that critical records are retained in system-neutral formats, then using email migration software to capture and preserve records either as they are created or at the end of a user’s lifetime. An example is providing users a designated ‘archives’ mailbox. (A minimal capture sketch follows this list.)
- The ‘capturing carbon’ or whole-system approach: implementing email archiving software to capture an entire email ecosystem, or a portion of it, to an external email storage environment.
- The ‘tagging and bagging’ or message-by-message approach: filing individual messages into existing electronic records management systems, which may not be effective.
- The Personal Archives Service approach: offering users backup services such as Carbonite or CrashPlan.
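As referenced above, here is a minimal sketch of the capture step in the ‘nurturing and harvesting’ approach: splitting a designated mbox file into individual .eml messages with a small manifest. It uses only the Python standard library; the file paths are hypothetical, and mbox/.eml are chosen because they are system-neutral formats most mail tools can read.

```python
import hashlib, mailbox, os

def capture_mbox(mbox_path, out_dir):
    """Split an mbox file into individual RFC 5322 .eml files plus a simple manifest."""
    os.makedirs(out_dir, exist_ok=True)
    manifest = []
    for i, msg in enumerate(mailbox.mbox(mbox_path)):
        raw = msg.as_bytes()
        name = "msg_%05d.eml" % i
        with open(os.path.join(out_dir, name), "wb") as f:
            f.write(raw)
        # record identity, date, and a fixity value for each captured message
        manifest.append((name, msg.get("Message-ID", ""), msg.get("Date", ""),
                         hashlib.sha256(raw).hexdigest()))
    with open(os.path.join(out_dir, "manifest.tsv"), "w") as f:
        for row in manifest:
            f.write("\t".join(row) + "\n")

capture_mbox("archive.mbox", "email_archive")   # hypothetical paths
```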
Unless we make the preservation of trusted email records a systematic part of our everyday operations, many important records will be lost. The report cites examples, such as 22 million emails from the Executive Office of the President of the United States surrounding the Gulf War that were lost when servers were replaced.
Lessons From a (Former?) Digital Preservation Neophyte.
Lessons From a (Former?) Digital Preservation Neophyte. Chris Prom. Website. April 10, 2012.
- Render objects
- Retain evidence
- Show authenticity
We do not need to understand it all at once, nor implement it all at the same time.
- Get your own house in order
- Begin helping others
- Develop a digital program statement
- Acquire and deal with born-digital material
- Begin to implement a TDR
General process:
- Understand your limitations
- Understand the format and structure of what you are trying to preserve
- Match tools to desired services
- Be consistent
- Be willing to experiment (and fail)
- Provide something other people need
- "The process of learning an art can be divided conveniently into two parts: one, the mastery of the theory; the other, the mastery of the practice." (Erich Fromm)
- "Thought can only lead us to the knowledge but it cannot give us the ultimate answer. The only way in which the world can ultimately be grasped lies not in thought, but in the act." (Erich Fromm)
Thursday, May 10, 2012
The Digital Preservation Network.
The Digital Preservation Network. George Eberhart. American Libraries Magazine. May 4, 2012.
The ARL membership meeting discussed the recently established Digital Preservation Network, a federation of universities intent on securing the long-term preservation of the digital scholarly record. James Hilton said that only universities, not private industry, can solve the problem of preserving born-digital data and making it accessible to future generations. “Universities last for centuries. Companies do not.”
Other notes from the report:
- The problem is too large for just libraries to handle. The presidents of universities must be on board with this.
- The data and metadata of research and scholarship are susceptible to multiple single points of technical, political, and funding failure.
- The program asks for a $20,000 initial commitment to help fund project software.
Digital Preservation Tools.
Digital Preservation Tools. Paul Wheatley. SPRUCE Project Wiki. May 10, 2012.
Open Planets Foundation Digital Preservation Tool Registry. The registry lists many tools of all types, and also references additional tool lists, such as http://agogified.com/tools-and-services. A few of the categories include:
- Transcription Tools
- DataOne
- Forensic tools
- PDF Creation
- File Conversion tools
- Audio file conversion tools
- Web capture
- Email mining tools
- Record analysis tools
- Curation and data management tools
- Bulk extractor tools
- etc.
Friday, May 04, 2012
Digital Preservation: Question and Response
Question: We are trying to move away from multiple physical copies of our archival audio. Our goal is to have redundant digital copies and one physical copy on some medium in case of catastrophic failure of the servers. What physical medium are other audio archives using for preservation/backup copies of digital audio files?
Response: It is important to have redundant digital copies in case of failure; we have had a number of occasions where we needed to recover objects from a second or third copy when other copies failed. There are different ways to accomplish multiple digital copies, such as copies on server, tape (LTO), cloud (DuraCloud and others), and optical discs, or a mixture. There are advantages and disadvantages with each option; we have a mixture of these, but are moving towards two options, server copies with tape backup, and M-DISCs. We have started using the Rosetta software to manage the server copies, which will have tape copies stored in a secure off site location. But I want an additional copy on another medium, which is our M-DISC archive.
In general optical CDs and DVDs have a short lifespan. For a number of years, I have been caring for a large disc archive of gold CDs and DVDs; there are multiple copies for redundancy. (There are tape copies also, which have had problems as well, but the gold archival discs have been considered the preservation copies.) Multiple copies are necessary since a percentage of the discs fail each year (the percentage depends on the collection). In order to solve that problem, two professors on campus (Information Technology, and Chemistry & Material Science) created a digital storage medium that does not fail over time and is unaffected by any normal factors, such as light, heat, cold, oxidation, pollutants, material decay, bit flips, etc. (Extreme stress tests by the US military could not get the discs to fail.)
The university licensed the technology to a company called Millenniata which has partnered with LG and others to produce the M-DISC. Currently the M-DISC is a DVD format, which is somewhat of a drawback if you want to store large archives, but they are developing other densities and options. (There are some organizations using a multi-terabyte Millenniata device, but I have not seen it.) I have tested and used the M-DISCs for several years and have not had any problems. (I plan to start another round of testing on my M-DISC archive this year to check on the status of the discs; I check them for usability, read error levels, and bit transfer integrity.) So I consider this my ‘copy in case of catastrophic failure’, while the Rosetta archive is my ‘active preservation archive’. There are others in the library and on campus (digital lab, records management, etc) that use the M-DISCs for their own long term copy for images, documents, audio, video, and such. Whatever you choose, you should think about multiple copies in multiple places on multiple media. What you choose needs to fit your circumstances and be sustainable for your program.
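The "multiple copies in multiple places on multiple media" advice implies routinely comparing those copies, which is how a failed copy is caught and replaced. A minimal sketch of such a comparison is below: it checksums the same object on each mounted copy and uses the majority digest to point at the copy that needs to be re-created. The mount points are hypothetical.

```python
import hashlib, os

def sha256(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical mount points for three copies of the same archive.
copies = ["/archive/server", "/archive/tape_staging", "/media/mdisc_01"]

def verify(rel_path):
    """Compare one object across all copies; a majority of matching digests
    identifies which copy (if any) has gone bad and should be re-created."""
    digests = {root: sha256(os.path.join(root, rel_path)) for root in copies}
    counts = {}
    for d in digests.values():
        counts[d] = counts.get(d, 0) + 1
    majority = max(counts, key=counts.get)
    for root, d in digests.items():
        if d != majority:
            print("Mismatch, restore from a good copy:", os.path.join(root, rel_path))

verify("audio/interview_001.wav")   # hypothetical object
```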
Applying to NEH's Preservation Assistance Grants - Webinar.
Applying to NEH's Preservation Assistance Grants - Webinar. Elizabeth Joffrion. National Endowment for the Humanities. March 12, 2012.
An excellent webinar for the Connecting to Collections community on NEH's Preservation Assistance Grants: what they are, how to apply for them, and examples of some projects that have been funded, including BYU's grant for a digital preservation workshop. It includes an introduction to the grants; who qualifies as a smaller institution; and proposal development and grant-writing strategies. NEH encourages institutions that have never applied before; the PAG grants are easy to write and there is no cost-sharing requirement.
Library of Congress Digital Preservation Newsletter.
Library of Congress Digital Preservation Newsletter. May 2012. [PDF]
Items from the Newsletter include:
- Key outcomes of the NDIIPP program are to identify priorities for born-digital collections and to engage organizations committed to preserving digital content; Viewshare is being used for the collections.
- Floppy Disks are Dead, Long Live Floppy Disks:
  - Floppy disks are fragile constructions that were never designed for permanence.
  - It can be difficult to determine what is on a floppy disk and to recover its contents.
  - A floppy disk controller called Catweasel allows computers to access a wide variety of older disk formats (a working floppy drive is still required).
- Web archiving: because of the scope of web sites, consider partnering with other institutions.
- Preservation of and Access to Federally Funded Scientific Data:
  - Research data produced by federally funded scientific projects should be freely available to the wider research community and the public.
  - Public data should be a public resource, and data sharing supports core scientific values like openness, transparency, and replication.
  - A lack of resources for curating scientific data and a lingering tradition of data hoarding create resistance to open access to research data.
Guidelines on the Production and Preservation of Digital Audio.
Guidelines on the Production and Preservation of Digital Audio. The International Association of Sound and Audiovisual Archives. ed. by Kevin Bradley. Second edition 2009.
A web version of guidelines on creating and preserving digital audio objects. Areas of interest covered include:
- Metadata, including preservation metadata
- It is important to document all aspects of preservation and transfer relating to audio files, including all technical parameters
- Persistent Identifiers
- Naming conventions
- Digital preservation planning
- OAIS: the services and functions for monitoring the environment and providing recommendations to ensure that the information stored remains accessible to the Designated User Community over the long term, even if the original computing environment becomes obsolete
- Preservation planning is the process of knowing the technical issues in the repository, identifying the future preservation direction (pathways), and determining when a preservation action, such as format migration, will need to be made.
- Small scale digital archiving
- Digital preservation is as much an economic issue as a technical one.
- Any proposal to build and manage an archive of digital audio objects should have a strategy which includes plans for the funding of ongoing maintenance and replacement
- Risks
- The only way to know the condition of a digital collection is constant and comprehensive testing.
- A preservation strategy requires a listing of the risks associated with the loss of technical expertise and how that will be addressed.
- An archive can distribute risk in a number of ways, such as:
- form local partnerships so that content is distributed;
- establish a relationship with a stable, well-funded archive; or
- engage a commercial supplier of storage services.
- Recording formats
- It is not recommended that audio streams be recorded for long-term storage.
- Recording as a digital data file is recommended.
- The recommended formats are .wav or, preferably, BWF .wav files (a minimal header check is sketched after this list).
- The large investment in a single format will also help support the continuance of that format for the longest period, as the industry will not change an entrenched format without significant benefits.
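As referenced in the recording-format items above, here is a minimal Python sketch that confirms a file really is a RIFF/WAVE file and reports its core audio parameters, treating the presence of a 'bext' chunk as the Broadcast Wave (BWF) marker. This is an illustration of the format's structure under those assumptions, not a full validator; the file name is hypothetical.

```python
import struct

def inspect_wav(path):
    """Confirm a file is RIFF/WAVE and report its core audio parameters."""
    with open(path, "rb") as f:
        header = f.read(12)
        if header[:4] != b"RIFF" or header[8:12] != b"WAVE":
            return None                     # not a WAV file at all
        info = {"bwf": False}
        while True:
            chunk = f.read(8)               # each chunk: 4-byte id + 4-byte LE size
            if len(chunk) < 8:
                break
            cid, size = chunk[:4], struct.unpack("<I", chunk[4:])[0]
            if cid == b"fmt ":
                fmt = f.read(size)
                info["format_tag"], info["channels"], info["sample_rate"] = \
                    struct.unpack("<HHI", fmt[:8])
                info["bits_per_sample"], = struct.unpack("<H", fmt[14:16])
                if size & 1:
                    f.seek(1, 1)            # chunks are padded to even length
            elif cid == b"bext":
                info["bwf"] = True          # broadcast extension chunk marks BWF
                f.seek(size + (size & 1), 1)
            else:
                f.seek(size + (size & 1), 1)
        return info

print(inspect_wav("master_take01.wav"))     # hypothetical file
```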
Wednesday, May 02, 2012
New DuraCloud Digital Archiving and Preservation Services.
New DuraCloud Digital Archiving and Preservation Services. Press Release. Duraspace. May 1, 2012.
The DuraSpace organization announced new DuraCloud subscription plans that offer three levels of digital preservation and archiving services in the cloud. Prices for the new subscription plans are competitive with commercial cloud providers and do not require additional transfer or variable costs.
- DuraCloud Preservation Basic: for institutions that would like to have two redundant copies of their original content stored at one cloud data center.
- DuraCloud Preservation Plus: for institutions that wish to have four redundant copies of their original content stored at two cloud data centers.
- DuraCloud Enterprise: a full suite of configurable DuraCloud features for institutions that need multiple DuraCloud sub-accounts for departments, research groups, cross institutional projects, or individuals.
The plans include:
• Copies of the content stored with multiple providers
• Automated health checking of content
• Automated repair of damaged files for Preservation Plus customers
• A full suite of reports
• Online sharing and streaming to any internet-linked device
Tuesday, May 01, 2012
Preserving Moving Pictures and Sound.
Preserving Moving Pictures and Sound. Richard Wright. DPC Technology Watch Report 12-01. March 2012. [PDF]
This excellent report is for anyone with responsibility for collections of sound or moving image content and an interest in preservation of that content. For audiovisual materials, digitization is critical to the survival of the content because of the obsolescence of playback equipment and decay and damage of physical items, whether analogue or digital.
The basic technology issue for audiovisual content is to digitize all items on the shelves, whether for preservation or access; the risk of loss is high. A second issue is moving content from its current media into digital files, and a third is preserving those digital files. This report describes the techniques and technologies for preservation planning, digitization and digital preservation of audiovisual content. Preservation of these materials is difficult because they differ physically, culturally, and economically from other collections.
The report explains signals and carriers: "Digital technology produces recordings that are independent of carriers. Carrier independence is liberation." Digital preservation of the digitized signal means preserving the numbers, but also the technology to decode the numbers. ‘Maximum integrity’ means keeping the full quality of the audio and video: as far as possible, the new preservation copy should be an exact replica of the original, and ‘the content should not be modified in any way’. This may be difficult to achieve.
The two basic kinds of preservation action are: 1) changing the audiovisual content within a collection, or normalization; 2) changing the system that holds the collection.
There are four main factors in an analogue or digital conservation program:
- packaging (wrappers), handling and storing;
- environmental conditions;
- protecting the masters; and
- condition monitoring, maintaining quality.
The four PrestoPRIME requirements for effective access to time-based media are:
- granularity: division of the content into meaningful units;
- navigation: the ability to select and use just one unit;
- citation: the ability to cite a point on the time dimension of an audio or video file, with a permanent link (a minimal sketch follows this list);
- annotation: the ability of a user of content to make time-based contributions.
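The citation requirement can be made concrete with the W3C Media Fragments URI syntax, where "#t=start,end" appended to a media URL addresses a time span. The report does not mandate this particular mechanism; the sketch below simply shows one way a permanent time-based link could be constructed. The URL is hypothetical.

```python
def cite(media_url, start_s, end_s=None):
    """Build a time-based citation link using W3C Media Fragments syntax (#t=start,end).
    Times are in seconds of normal play time; a fragment-aware player will
    begin playback at the cited point."""
    fragment = "#t=%g" % start_s if end_s is None else "#t=%g,%g" % (start_s, end_s)
    return media_url + fragment

# Cite seconds 90-125 of an interview recording at a hypothetical archive URL.
print(cite("https://av.example.org/interview_001.mp4", 90, 125))
# -> https://av.example.org/interview_001.mp4#t=90,125
```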