Have you ever lost parts of your life (photos, documents, essays) because your computer let out its last sigh and shut down, not giving you a chance to save your work and memories – or at least to say goodbye? If not, you are either unbelievably lucky or thoughtful enough to have taken preventive measures against this heartbreaking experience. If you have been burnt once or twice, you have probably acquired the habit of zealously protecting your files by backing them up across several devices and online platforms. As computers have entered our daily lives over the past few decades, most regular users have developed strategies to safeguard their digital records from the whims of chance. Fewer are aware, however, of two other threats: the processes of decay and obsolescence.
For some reason, it comes as a surprise that data deteriorates over time. An explanation for this illusion of permanence can perhaps be found in the idea of the digital as “immaterial”, and hence immune to the forces of decomposition that gradually take apart the material world. Data rot (also known as “bit rot”) occurs when the code of ones and zeros that constitutes anything digital becomes corrupted. Often the cause is the deterioration of the physical storage media on which files are kept: physical damage to a CD or DVD, such as scratches, can render it unreadable. In addition, every time a file is opened, copied, or transferred, there is a slight risk that the strings of code will be misread by the software or that bits will be lost. In the long run, both the accumulation of such small chance errors and the deterioration of the storage medium can result in corrupted files that cannot be opened properly, or at all.
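The standard safeguard against this kind of silent corruption – not discussed above, but worth illustrating – is the checksum: a short “fingerprint” computed from a file’s bits that changes if even a single bit flips. Below is a minimal Python sketch (the file paths are hypothetical, chosen only for illustration) showing how a fingerprint recorded when a file is stored can reveal bit rot later.

```python
import hashlib
from pathlib import Path

def file_checksum(path: Path, algorithm: str = "sha256") -> str:
    """Return a hex fingerprint of the file's bits, read in 1 MiB chunks."""
    digest = hashlib.new(algorithm)
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical paths, for illustration only.
# Record the fingerprint when the file is first stored...
original = file_checksum(Path("photos/holiday.jpg"))

# ...and recompute it after a copy or transfer. Any mismatch means
# at least one bit has changed: the copy has started to rot.
if file_checksum(Path("backup/holiday.jpg")) != original:
    print("Checksum mismatch – this copy is corrupted.")
```

A mismatch does not repair anything by itself, of course; it simply tells you which copy can no longer be trusted, so that a healthy backup can take its place.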
Computer users tend to be more familiar with obsolescence, since their browsers and operating systems normally inform them when their software is out of date. These warnings, however, are now so commonplace that they do not necessarily inspire much reflection on the process behind them. Both software and hardware are caught up in an incessant digital “evolution”: they are constantly improved to accommodate users’ needs and to offer new opportunities for interaction within the digital world. As that process of enhancement goes on, data must adapt in order to remain usable. But what happens when a medium or a piece of software cannot keep up? The floppy disc has already been driven to extinction, and Adobe Flash is presently endangered. The hard disk of your computer will not live forever either. As Dag Spicer, curator of the Computer History Museum in Silicon Valley, has put it: ‘The computer industry is one of planned obsolescence.’* Or, to extend the metaphor of evolution, it is a real survival of the fittest!
Whereas bit rot and obsolescence concern anyone interacting with digital content (i.e. most people with computer access), tackling these issues is of paramount importance for institutions engaged in archival work. The turn to the digital is exciting, creating possibilities for new and innovative interactions with materials that were previously hidden behind the walls of libraries, archives and museums. However, digitisation does not end with uploading images and releasing them to users. Institutions must ensure that their digital collections are future-proof. Before undertaking digitisation projects, they need to consider the means by which they will deliver their treasures to the outside world: how long will the chosen formats and software stay around? How often will they need to be updated or checked? Will they be compatible with most browsers? All these questions need to be answered in order to produce a high-quality digitised product.
The work does not end once the images are released: digitisation extends indefinitely through the ongoing process of digital preservation. An institution might conduct digital preservation itself or outsource it – there are businesses dedicated to taking care of digital content. The specific procedures vary from case to case, but several steps are essential to any such service. First comes an evaluation of what digital content has been produced, where it is stored and whether it is valuable – not everything needs to be kept. Metadata and the legal status of the content (e.g. whether it is in copyright or subject to data protection) are also taken into account. The next steps are risk assessment and preventive measures, such as using special software to convert files at risk into more stable formats. From then on, the institution should conduct regular checks of the state of the files and of access to them, to ensure that they remain stable and usable – a routine sketched below.
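To make the “regular checks” step concrete, here is a minimal sketch of a fixity check of the kind preservation systems run on a schedule. It assumes a simple CSV manifest with `path` and `sha256` columns recorded when the files were ingested – an assumption made for this sketch; real preservation workflows typically use standardised packaging formats such as BagIt, but the principle is the same: re-hash every file and flag anything missing or changed.

```python
import csv
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Re-compute the SHA-256 fingerprint of a stored file."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def fixity_check(manifest: Path) -> list[str]:
    """Compare every listed file against the checksum recorded at ingest.

    The manifest layout (CSV with 'path' and 'sha256' columns) is an
    assumption made for this sketch, not a fixed standard.
    """
    problems = []
    with manifest.open(newline="") as f:
        for row in csv.DictReader(f):
            target = Path(row["path"])
            if not target.exists():
                problems.append(f"MISSING: {target}")
            elif sha256_of(target) != row["sha256"]:
                problems.append(f"CORRUPTED: {target}")
    return problems

if __name__ == "__main__":
    # Run periodically (e.g. from a scheduled job) and review the report.
    for problem in fixity_check(Path("manifest.csv")):
        print(problem)
```

Any file flagged here can then be restored from another copy – which is why the backing-up habit described at the start of this piece matters just as much for institutions as for individuals.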
Once created, digital materials cannot be left unattended. Just as physical objects require care and attention to sustain their integrity, digital ones need maintenance – at times more complicated and time-consuming than that of their material counterparts. Without doubt, computers have brought immense improvements to our lives, and at quite a rapid pace. Their evolution, however, continues just as rapidly, and to keep up with it, institutions and individual users alike need to adapt to its speedy transformation. We have to come to terms with the fact that, unlike a handful of souvenirs that can be put in a box, forgotten in the attic and discovered years later by a curious grandchild, our digital possessions might not survive complete abandonment in the dark of the hard disk.
* https://pogue.blogs.nytimes.com/2009/03/26/should-you-worry-about-data-rot/
Written by Mila Daskalova, MSc student in Book History and Material Culture, University of Edinburgh.