Which brings me to today’s backup conundrum, which comes courtesy of IT Week UK: if you back up a corrupted file (or system), then instead of having a working copy to fall back on, you just have two corrupt copies.
Now, obviously, if you know a file is corrupted or that your system is infected with a virus, you’re not going to back it up. But automated backup systems, on-site or off, aren’t that smart. Either they overwrite files completely with the newer version, or they overwrite the bytes that have changed—including changes made by data corruption or viruses. Not good.
One professional geek I know goes so far as to claim that drive images are useless, because the ones clients bring to him along with their malfunctioning computers are so full of viruses, spyware, and other problems.
As I’ve said before in this newsletter, make sure you clean out your system before creating a drive image. Scan it for adware, spyware, and viruses. (And make sure you turn off Windows System Restore before removing viruses, or you’ll risk returning to an infected state.)
Moreover, if you use disk-cloning software, keep more than one image of your drive, starting with the bare operating system. This gives you a better chance of having a functioning system to return to. I keep one image of the operating system alone, one of the OS plus the drivers for all my peripherals (printer, scanner, handheld PC, Wacom tablet), and one of the drive with my major software installed. From there I make weekly images that include my data, keeping the most recent two or three, depending on how much space is available on my XHD (external hard drive).
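If you want to automate the "keep only the most recent few" part of that rotation, a tiny script can prune old image files for you. This is just a sketch of the idea, not part of any cloning product: the folder, the `weekly-*.img` naming pattern, and the function name are all my own assumptions, so adjust them to match however your software names its images.

```python
from pathlib import Path

def prune_weekly_images(folder, pattern="weekly-*.img", keep=3):
    """Delete all but the `keep` newest files matching `pattern` in `folder`.

    Returns the names of the files that were kept, newest first.
    """
    images = sorted(Path(folder).glob(pattern),
                    key=lambda p: p.stat().st_mtime,  # sort by modification time
                    reverse=True)                     # newest first
    for old in images[keep:]:
        old.unlink()                                  # remove the stale images
    return [p.name for p in images[:keep]]
```

Run it after each weekly image completes and the oldest images are cleared out automatically, so you never have to remember to make room on the external drive by hand.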
But what about corrupted documents, the specific issue that moved Keyvyn Taylor to write his piece for IT Week? File corruption is a cross-platform problem: I’ve experienced it on both Macs and PCs.
One option is to rename the file each time you work on it, so that you keep multiple versions and won’t lose the whole thing. This takes up more storage space, of course, but it might be worth it for important client files and work projects. The more you use and change a file, the greater the possibility of corruption.
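For those comfortable with a little scripting, the renaming chore can be reduced to one step: copy the file to a timestamped name each time you finish working on it. The sketch below is just one way to do it, and the function name is my own invention; the timestamp format doubles as the "version-naming protocol," since sorting the names alphabetically also sorts them by date.

```python
import shutil
from datetime import datetime
from pathlib import Path

def save_versioned_copy(path):
    """Copy a file to name.YYYYMMDD-HHMMSS.ext alongside the original.

    The original file is left untouched; the timestamped copy becomes
    the fallback if the working file is later corrupted.
    """
    src = Path(path)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = src.with_name(f"{src.stem}.{stamp}{src.suffix}")
    shutil.copy2(src, dest)  # copy2 also preserves the file's timestamps
    return dest
```

Calling `save_versioned_copy("report.doc")` would leave behind something like `report.20250301-141502.doc`, and old versions can simply be deleted when they are no longer needed.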
Doing this several times a day could get to be tedious, though, and if the document is stored on a network server, you have to make sure everyone uses the same version-naming protocol, or people who need the file will have trouble finding it and knowing which version is most recent. Taylor suggests creating a plug-in for Microsoft Office which would perform this task for you. It’s an interesting idea, but well beyond my abilities. It’s probably easier than creating a backup program which verifies file structures as well as matching bytes, though.
As always, I welcome your suggestions for addressing this issue, and any stories you have about your experiences with infected backups.
Remember: the copy is only as good as the original. Backups you don’t have to think about making can have drawbacks.
It is always good to have multiple generations of HD images, and with software such as Ghost, individual files can be retrieved; that can be a lifesaver. With dropping HD prices, the cost of backup storage is negligible compared to the cost of data loss. Even with an XHD, I would invest in two of them, just in case something happens to one. In one of the previous reminders, mention was made of Bart’s Bootable CD. I created a bootable CD with Ghost 2003, and it works like a charm. Creating Bart’s Bootable CD is not for the faint of heart; it needs some in-depth computer knowledge.

Ramadoss
Ramadoss is right: it’s good to have more than one XHD. I have two at the moment, and will need a new, larger one when I get a new computer with a larger internal drive. The Ur-Guru made my copy of the BART-PE CD; my own attempts resulted in nothing but frustration.

And speaking of the Ur-Guru, he had this to say about the IT Week story:

******************************

“If your doc is stored on a network server I would assume that they (whoever runs the network) would have set it up so that either shadow copy is active, or some document management system of a 3rd party, or perhaps some VCS/RCS style system that will store a latest copy but has full backtracking to any previous copy.

“That’s what document management systems were designed for so you don’t need anything on the client side… let the server handle the updates and copies and versioning. That’s also what version/revision systems were designed for. And it’s partly what shadow copy can do, in a more crude and space-wasteful way.

“Why put all the responsibility or technology out on multiple client systems and have users deal with it when you *have* a central server… let the server deal with it. One point of failure vs. many points of failure.”

******************************

That makes sense to me, from the little I understand about servers and document management systems. I’ve suggested he take it up with Keyvyn Taylor, as most of my e-zine readers don’t have servers.