It's not enough to merely encrypt data; you must also provide and use appropriate procedures, or risk invalidating your efforts. In particular, a full set of restoration procedures and thorough testing are required. Whatever method you use to encrypt archive media, the archive's decryption must be integrated into any effort to restore the archive. If not properly implemented, data encryption may provide only a false sense of security. Encryption alone does not provide a secure foundation for your system's data. Key management and access policy restrictions on both live and offline data are particularly important. Without some policy controlling who can do what with backups, situations can occur where groups do not communicate well and data is inadvertently destroyed. The same is true of key-protected material, where problems may exist with the keys as well as with the data the keys are there to protect.
Use proper document retention policies in conjunction with backups. In many businesses, attention is given to document retention only in the context of accounting data. While it is good to back up the databases associated with producing a monthly, quarterly, or year-end financial report, the associated source documentation files should also be backed up. If you are the person responsible for scheduling backups, rotating tapes, and doing the retention scheduling on backup media, it's important to know whether the data you are responsible for carries legal retention requirements that exceed the organization's actual use of that data.
Proper storage of media is an important part of any backup policy. Most media decays, and electronic media is vulnerable to the ravages of time. Currently, CD-ROM discs have the best shelf life, with DVDs a close second, but both are very quality-dependent. Some DVDs are quite fragile, and problems often manifest in the layering technology used to manufacture them. If you are planning on using DVD media for long-term storage, check the media quality first; higher-grade discs generally last longer. CDs and DVDs should be stored in a cool, dark environment, away from ultraviolet light sources.
Magnetic media has gotten much more reliable than it was in the past, but it is still important to look closely at a vendor's MTBF (mean time between failures) figures and whatever life-cycle information they provide. The media itself can suffer from oxidation over time, although newer tape materials have drastically reduced this problem. Tape can also become brittle with age and break easily upon attempted use. There is another and far more common problem that comes into play with magnetic tape storage, however: inadvertent demagnetization. Often tapes that are being shuttled offsite or stored locally end up in aluminum cases or plastic tubs. Storing these cases near magnetic sources or placing tubs near computer monitors can play havoc with the contents. If the floor near the containers is regularly vacuumed, then on each pass a strong electromagnetic motor is oscillating at a very high rate near the tape.
In addition to preserving data, it is inevitable that some data will need to be safely and thoroughly destroyed. There are several ways to destroy a CD or DVD. They can be broken, although the shattered fragments can be sharp and dangerous. They can be scarred to the point of destruction; removing portions of the silvered media on a CD does a good job. Dedicated CD/DVD shredders are also a worthwhile option.
Destroying data stored on magnetic media is usually accomplished on an individual tape basis by erasing it in a drive. When there are too many tapes to erase in tape drives, you should purchase a tape demagnetizing device. Some are portable hand units; others are tabletop devices. When tapes are ready for recycling, their disposal does not generally require specialized media destruction procedures.
The expiration of public and private keys associated with an archive should closely match the expected lifetime of the archive, or you could find yourself with files with no keys, or keys with no files. This doesn't present a problem if complete destruction of the documents contained in the archive is your goal, since without keys, decryption of your media becomes nearly impossible even with fairly weak encryption. However, the destruction of keys should be entirely within your control and, if you have created good procedures, should never happen due to misunderstanding.
To secure backups, keep them offsite. This can mean anything from someone taking a few tapes back and forth from work every day to data storage in locked containers in a storage bunker. The primary goal is usually to secure copies of the data against disaster and theft. Properly handled, backups can help secure data against disaster; theft and mishandling, however, cannot always be addressed by storage placement decisions alone. If data is kept in a third-party archiving facility, the main concern will be mishandling, as it is easy for confusion to develop without prior agreement over how to handle the data stored offsite. If the other location is another company office, confusion over storage and usability of the media can come into play, so it is a good idea to develop shared procedures across facilities. If you have a small- to medium-sized business and are considering any sort of rented storeroom or home storage, it is very important to consider the possibility of casual theft or fire. The best way to combat these is through the use of a fire safe. There are many vendors for these devices, and they can be found at many home improvement stores as well.
There are handling procedures that go beyond transport. At a minimum, two people in any organization should know what the critical backups are, where they are stored, and the procedures for handling them; preferably there should be more than two. If the backups contain data that is material to the conduct of day-to-day business by the directors and officers of the corporation, has a direct impact on profit and loss activities, or is in some other way deemed mission-critical to business continuity, the people who know what to do with the backups should not travel together if at all possible.
It is a good idea to securely remove plaintext source files, under all circumstances, from any system you do not fully trust. Further, given that no system is 100 percent trustworthy, it may be a good idea to store the data encrypted except when you're actually working with it. This reduces your vulnerability to the latest bugs and exploits, and leaves you much less exposed to the mercies and whims of those responsible for patching and debugging potential problems.
When a file is encrypted, the encryption process leaves the original plaintext file in place, unmodified. Some software offers removal of source files as an option, but this is a non-cryptographic feature and is not always present. If a software package says that it removes a file, in most cases this amounts to a simple removal, not obliteration of the source data. Some packages offer a choice to obliterate the data; always do so. The normal file removal process leaves cleartext data intact, and it is quite common for old data to lurk on disks for a very long time. Overwriting a file thoroughly removes the information that was there, scrubbing trace information from the disk in the process. Since it is possible to examine data remnants even on a live system, using a separate server to stage files, overwriting, and controlling access to encryption keys are all critical steps.
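The encrypt-then-obliterate workflow can be sketched in a few commands. This is only an illustration, assuming openssl as the encryption tool; the filenames and passphrase are hypothetical, and you would normally supply the passphrase through a key-management mechanism rather than on the command line:

```shell
set -e

# A stand-in for a sensitive plaintext source file.
echo "confidential payroll data" > /tmp/report.txt

# Encrypt the plaintext into a separate ciphertext file.
# (openssl enc with -pbkdf2 requires OpenSSL 1.1.1 or later.)
openssl enc -aes-256-cbc -pbkdf2 -salt \
    -in /tmp/report.txt -out /tmp/report.txt.enc \
    -pass pass:example-passphrase

# Overwrite and remove the cleartext source so it cannot be undeleted.
shred -u /tmp/report.txt
```

After this runs, only the encrypted copy remains; the cleartext blocks have been overwritten rather than merely unlinked.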
To securely remove files, use the shred command. It is part of the GNU fileutils package, and thus is common to Red Hat and SUSE distributions. The program can be found as /usr/bin/shred and is far better than the simple rm command. Using rm does not erase the data that was contained in a file; it removes the linkage to the file and marks the space as available to the operating system for reuse. Files that have been removed with rm can be restored using a variety of easily obtainable commercial and open source tools. In many cases, even if the full file is not restorable, partial restoration is possible. In all cases, if the data is still on the disk in the clear, it can be viewed by any interested party with the necessary access. Viewing does not require complex machinery, or even that a disk be removed from a system, although under some circumstances removal is a common procedure to isolate the disk from further changes. Files deleted from a Unix system are not as easily restored, nor is the data quite as easily accessed, as those deleted under other operating systems. This is partially due to market saturation (fewer off-the-shelf undelete tools exist), but also due to the complexity of the file system designs. The basic rule of data still applies, however: until it's overwritten, the data is still there. There are undelete utilities for most Unix file systems, but they are not common software. Mainly they end up being used by people who do forensic work or data recovery, but they are freely accessible and can be put to use in information gathering. It is therefore prudent to overwrite the cleartext of files to whatever degree is possible when they are destined for encryption.
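The difference between unlinking and overwriting is easy to see directly. In this small demonstration (paths are illustrative), shred replaces the file's contents in place, so a search for the original text afterward comes up empty:

```shell
# Create a file containing a recognizable marker string.
echo "SECRET-MARKER" > /tmp/victim.txt

# Overwrite it 3 times, but keep the file so we can inspect it.
shred -n 3 /tmp/victim.txt

# The marker no longer exists anywhere in the file's blocks.
grep -q "SECRET-MARKER" /tmp/victim.txt || echo "marker overwritten"
```

Had the file been removed with rm instead, the marker string would still be sitting in the freed blocks on disk, waiting to be recovered.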
shred is an accessible and easy-to-use command that serves well to obliterate information contained in cleartext files to a reasonable degree, though it does have limitations. On log-structured or journaled file systems, and on file systems that use caches, the information contained in a shredded file will not be fully and immediately removed. However, using shred on files in these types of file systems still significantly raises the amount of effort required to access any remaining fragments of data. Subsequent file system traffic will eventually remove all traces of removed files, but you cannot count on this occurring unless you know the affected system receives enough traffic to create those changes.
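Before relying on shred, it is worth checking what kind of file system a given path actually lives on. A quick check, assuming a Linux system (/tmp here is just an example path):

```shell
# Print the file system type backing a path, e.g. ext4, xfs, btrfs, tmpfs.
# Journaled types such as ext4 (in data=journal mode) or copy-on-write
# types such as btrfs may keep copies of data that shred cannot reach.
df -T /tmp

# An alternative, if util-linux is installed:
# findmnt -n -o FSTYPE --target /tmp
```

If the output shows a copy-on-write or journaling file system, treat shred as a hardening measure rather than a guarantee, and consider encrypting data before it is written in the first place.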
To use shred to remove and overwrite files, it should be invoked like this:
# /usr/bin/shred -u <filename>
The -u flag tells shred to remove the file after overwriting it. The default is to overwrite the file 25 times without removing it. On particularly large files this may consume an excessive amount of time, and the number of overwrite passes can be reduced via the -n N switch, where N is the number of times you wish to overwrite the file. To reduce the pass count while also removing the file, use the following command:
# /usr/bin/shred -n10 -u <filename>
This would overwrite the file 10 times and then remove it entirely. If your file operations need a higher level of security than this, it is possible to perform your encryption and decryption operations inside the /dev/shm directory. This is a shared-memory file system, and thus suitable for these sorts of traceless file operations, though the method is constrained by the amount of available memory. Its main virtue is that a minimum of data remnants is created during the encryption processing efforts. Such remnants, whether deleted but not overwritten files, partially overwritten segments of files, or remainders of filenames, can all be used to extract information that encryption is trying to protect.
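Staging work in /dev/shm might look like the following sketch. It assumes a Linux system where /dev/shm is a memory-backed tmpfs; openssl, the paths, and the passphrase are illustrative stand-ins:

```shell
set -e

# Create a private staging directory in shared memory, so intermediate
# plaintext never lands on disk.
workdir=$(mktemp -d /dev/shm/crypt.XXXXXX)
chmod 700 "$workdir"

# A stand-in for sensitive working data.
echo "draft contract text" > "$workdir/draft.txt"

# Encrypt from the memory-backed staging area out to durable storage.
openssl enc -aes-256-cbc -pbkdf2 -salt \
    -in "$workdir/draft.txt" -out /tmp/draft.txt.enc \
    -pass pass:example-passphrase

# Clean up the staging area; shred is cheap insurance even on tmpfs.
shred -u "$workdir/draft.txt"
rmdir "$workdir"
```

Only the encrypted output ever touches the disk; the plaintext existed solely in memory and was overwritten before the staging directory was removed.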