Best Practices for Home Desktop Computer Backups 2

Posted by JD 11/12/2011 at 03:00

The Checklist

  1. Stable / Works Every Time
  2. Automatic
  3. Different Storage Media
  4. Fast
  5. Efficient
  6. Secure
  7. Versioned
  8. Offsite / Remote
  9. Restore Tested

When you are looking for a total backup solution, those are the things you want from it.

Stable / Works Every Time

I read a few reviews of backup tools under Windows. The test was to perform a backup followed by a full restore to a different disk, then compare the source disk against the newly restored one. Any difference was considered a failure. Most of the Windows tools produced restored file systems that did not match the source 100%, often in very important ways.

Automatic

If any manual steps are needed, eventually, humans will stop doing them. I’ve been there: I did backups manually every week for almost 6 months. Then every quarter. Then … er … It is human nature to stop. Since I started doing automatic backups, they happen every night. Sure, about 2-5 times a year a backup fails, but the other 360 times, I have a good backup. Automatically.
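
On Linux, automation usually means a cron entry. As a hedged sketch (the script name, schedule, and log path below are placeholders, not from this article):

```
# Crontab entry: run the backup script every night at 02:00.
# /usr/local/bin/nightly-backup.sh is a hypothetical script name.
0 2 * * * /usr/local/bin/nightly-backup.sh >> /var/log/nightly-backup.log 2>&1
```

Once the entry is installed with `crontab -e`, the backup runs whether or not you remember it.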

Different Storage Media

HDDs tend to fail completely, not just in a single partition. Writing to a completely different storage medium is critical. Laptops come with a reinstallation partition, but if the entire HDD is dead, that partition is completely useless. The data, OS, and recovery partition all need to be backed up to different media. Any USB disk or networked disk share works.

Fast

If backups take 5 hours, you won’t do them. OTOH, if they require 3-5 minutes, that isn’t too much of a commitment, so you will.

Efficient

This is about storage efficiency. If you need 30 backups and they require 30x the storage, that is neither efficient nor practical. There are backup methods and tools that are extremely efficient, compressing both the base data and any data that changes. I know this is possible, since I keep 90 days of backups of my personal files from a laptop. The data in my HOME uses 5.06GB of storage, and the backup area with 90 days of history uses 6.55GB. To me, that is such a minor amount of storage that going without 3 months of backups isn’t worth it.

Secure

If the backups contain sensitive data of any type, encryption is needed. It is especially important when backups travel over an untrusted network, like the internet, or when the backup media can leave the secured location. That means if you use a portable HDD for your backups, it really needs to have encrypted storage.
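
As one way to get encryption with nothing but standard tools, a tar stream can be piped through OpenSSL before it ever touches the backup media. This is a minimal sketch with made-up paths and a throwaway passphrase; a real setup should read the passphrase from a protected file, not the command line:

```shell
#!/bin/sh
set -e
# Demo tree standing in for real sensitive data (illustrative paths).
rm -rf /tmp/secure-demo
mkdir -p /tmp/secure-demo/src /tmp/secure-demo/restore
echo "tax records" > /tmp/secure-demo/src/taxes.txt

# Encrypt: tar the directory and pipe it through AES-256.
tar -cz -C /tmp/secure-demo/src . |
  openssl enc -aes-256-cbc -pbkdf2 -pass pass:demo-passphrase \
    -out /tmp/secure-demo/backup.tar.gz.enc

# Decrypt and unpack to a scratch directory to verify the round trip.
openssl enc -d -aes-256-cbc -pbkdf2 -pass pass:demo-passphrase \
  -in /tmp/secure-demo/backup.tar.gz.enc | tar -xz -C /tmp/secure-demo/restore
```

The encrypted file is safe to put on a portable disk; without the passphrase it is just noise.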

Versioned

RAID and mirroring (as with rsync) are not good enough for most backups, because corruption of the source files immediately corrupts the second copy or mirror. Corruption can happen for many reasons (hardware faults, controller bugs, software defects, logical errors), so having versioned file backups is critical. If you can restore a file from yesterday AND a different file from last week, then your backups are versioned.
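
The concept can be illustrated with nothing fancier than date-stamped snapshot directories (a crude sketch with made-up paths; real versioned backup tools share unchanged data between versions instead of copying everything):

```shell
#!/bin/sh
set -e
rm -rf /tmp/version-demo
mkdir -p /tmp/version-demo/src
echo "draft 1" > /tmp/version-demo/src/report.txt

# Snapshot 1 (pretend this ran last week).
cp -a /tmp/version-demo/src /tmp/version-demo/backup-2011-11-05

# The file changes, then snapshot 2 (pretend this ran yesterday).
echo "draft 2" > /tmp/version-demo/src/report.txt
cp -a /tmp/version-demo/src /tmp/version-demo/backup-2011-11-11

# Both versions are restorable independently -- that is "versioned".
cat /tmp/version-demo/backup-2011-11-05/report.txt   # draft 1
cat /tmp/version-demo/backup-2011-11-11/report.txt   # draft 2
```

A mirror would only ever hold "draft 2"; if that draft were corrupted, both copies would be bad.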

Corruption can happen from computer viruses, rootkits, or remote crackers too.

Offsite / Remote

Fire, tornado, floods, earthquakes, thieves. Enough said?

Restore Tested

Until you test your restore, you have nothing. Backups are like insurance: you pay for them all the time and hope you never need them. But since hardware fails all the time, each of us will probably need to restore at some point. Hope is not a plan. Having a tested, verified, validated restore process is critical.
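
A restore test doesn’t have to be elaborate; restoring into a scratch directory and comparing against the source already proves a lot. A minimal sketch with made-up paths, using plain tar as the stand-in backup tool:

```shell
#!/bin/sh
set -e
rm -rf /tmp/restore-demo
mkdir -p /tmp/restore-demo/src /tmp/restore-demo/restored
echo "important data" > /tmp/restore-demo/src/notes.txt

# "Backup" step: archive the source.
tar -czf /tmp/restore-demo/backup.tar.gz -C /tmp/restore-demo/src .

# Restore step: unpack to a scratch directory, NOT over the original.
tar -xzf /tmp/restore-demo/backup.tar.gz -C /tmp/restore-demo/restored

# Verify: any difference means the backup cannot be trusted.
diff -r /tmp/restore-demo/src /tmp/restore-demo/restored && echo "restore verified"
```

The same pattern (restore to scratch space, diff against the source) works with any backup tool.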

Fine. I’m Convinced. What Software Should I Use?

The software that gets you doing backups is the software you should use. If rsync is the tool that gets you creating a mirror every month, that is better than no backups at all, right? I really like rsync for specific uses, but there are better tools for backups. To me, rsync is an 80% tool: it provides 80% of what most people need in a backup. I think we can do better.

The easiest tools that meet all the best practices listed are Duplicati and Duplicity. Both are F/LOSS, with Duplicati being a GUI version of Duplicity. I will admit that I have only played with them and do not use them myself for backups; that does not detract from their capabilities in any way. Duplicati is also cross-platform (Windows, Mac, and Linux), so everyone can use it. Both of these tools store backups inside volume containers, so restores are slightly more complex than with other tools: you must have the software loaded to restore.
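
As a rough sketch of how Duplicity is driven from the command line (the paths and the passphrase handling here are illustrative, not taken from this article):

```shell
# Encrypted, incremental backup of a home directory to a USB disk
# (a file:// target); the PASSPHRASE variable feeds GPG.
export PASSPHRASE="your-gpg-passphrase"
duplicity /home/user file:///mnt/usbdisk/backups

# Restoring requires the duplicity software, since the backups live
# inside volume containers rather than as plain files.
duplicity restore file:///mnt/usbdisk/backups /tmp/restored-home
```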

My preferred backup tool on Linux is rdiff-backup. It doesn’t meet all the best practices alone, but it doesn’t prevent us from creating a completely best-practice-compliant backup solution ourselves. The commands are nearly identical to rsync, so if you use rsync today, then you really owe it to yourself to check out rdiff-backup. To restore from the last backup, you do not need the rdiff-backup software; it is just a copy command. I really like that.
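
A hedged sketch of typical rdiff-backup usage (the destination paths are placeholders):

```shell
# Mirror-plus-increments backup; the command shape mirrors rsync.
rdiff-backup /home/user /mnt/backup/home

# Restore one file as it was 7 days ago (-r = --restore-as-of).
rdiff-backup -r 7D /mnt/backup/home/docs/report.odt /tmp/report-lastweek.odt

# Trim increments older than 90 days to cap storage use.
rdiff-backup --remove-older-than 90D /mnt/backup/home
```

Because the newest backup is stored as a plain mirror, the latest version of any file can also be restored with an ordinary cp.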

If you are new to Linux, or perhaps you just prefer a GUI, then Back-In-Time is a tool worth checking out. To restore from the last backup, you do not need the back-in-time software. I really like that.

Windows Backups

From a completely different perspective, here’s a Windows-centric Best Practices article that also recommends some commercial software.

I’m hardly an expert, but for Windows systems I think there are 2 types of backups needed.

  • System Image (OS and installed Apps)
  • Data Backups (all user files)

Image-Based Backups

For home users, an exact system image is needed because of the usual hardware-tied licensing of Microsoft Windows. Businesses have more flexibility, but home users generally get MS-Windows with the PC they bought, and that license is tied to the specific hardware; it will not run on other hardware. If you have a retail or upgrade version of MS-Windows, this restriction may not apply. I prefer to use Linux, where there aren’t any license restrictions on the hardware. Sure, not all hardware is supported by every version of Linux, but at least you know it isn’t something specifically coded to prevent use.

For System Images, you can use any tool you like that does bit-for-bit copies. Some examples are:

  • dd or any of the safe-dd or rescue-dd tools
  • Clonezilla
  • PartImage – my favorite; simple to use and understand
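
As a safe way to try the dd approach, the sketch below images a scratch file instead of a real device such as /dev/sda (which is what you would image in practice, from a live CD, never while the system is running):

```shell
#!/bin/sh
set -e
rm -rf /tmp/image-demo
mkdir -p /tmp/image-demo
# A 1 MiB scratch file stands in for the source disk (/dev/sda in real use).
dd if=/dev/zero of=/tmp/image-demo/disk bs=1024 count=1024 2>/dev/null
echo "boot sector" | dd of=/tmp/image-demo/disk conv=notrunc 2>/dev/null

# The image: a bit-for-bit copy, gzipped to save some space.
dd if=/tmp/image-demo/disk bs=4096 2>/dev/null | gzip > /tmp/image-demo/disk.img.gz

# Verify the image matches the source exactly, bit for bit.
gunzip -c /tmp/image-demo/disk.img.gz | cmp - /tmp/image-demo/disk && echo "image verified"
```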

These are not incremental, so they require lots of storage for each run. If you create a system image monthly, I think you are doing pretty well. I do it about once a quarter. Further, I try to load software only through Ninite installation packages, so that application maintenance is easier and reloading software means 1 installer, not 20. A few commercial Windows apps may need to be loaded manually, but that’s much less effort than manually loading 20+ apps, right? If an app hasn’t changed since your last system image, there’s no need to load a new version anyway.
I also make system images before and after a service pack install. I’ve been burned before.

Incremental Data Backups

For incremental data backups, I try to push all the data on Windows to a Linux file server and use Linux backup tools. However, I’ve also used the win32 version of rdiff-backup under Windows. The last version I loaded showed issues with large files (anything over 2GB), but it worked just fine for typical word processing and similar types of files. Recent reviews of Windows-specific backup software show issues too: some tools didn’t actually back up every file, or refused to restore some files. So if you are going this route, please test both the backup AND the restore capabilities. Reading reviews is a really good idea too.
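
The push-to-a-file-server setup might look like this (hostnames and paths are hypothetical):

```shell
# Step 1: copy the Windows user files onto a share exported by the
# Linux file server (any copy tool you trust works for this step).

# Step 2: on the Linux file server, run a versioned backup of that
# data over SSH to a second machine.
rdiff-backup /srv/windows-data backupuser@backuphost::/backups/windows-data
```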

Get Some Backups Going Today

Even if your selected tool isn’t automatic, compressed, storage efficient, or encrypted, having some backups is better than having no backups. Every once in a while I come across someone using ZIP or TAR to create backups. Usually these people like something simple and easy to understand. I do too, but for about the same effort they could use a simple tool that provides so much more, can be automated, and is really efficient. Why wouldn’t they do that?

  1. Kari 12/14/2011 at 14:39

    Before one embarks on a program of regular backups one probably has a computer with 500gb of mainly chaff. Downloads, installed but unused programs, music never listened to etc. My machines get like this and it is onerous to identify and delete/keep this stuff. I don’t want to be wasting time backing up junk so as well as a strong backup regimen one needs to be on top of the game when it comes to deleting. By all means download stuff but install it and either then delete the installer package or give it a more intuitive name so you recognise it when it comes to delete time. Keep on top of every download and installation and if it turns out you don’t need it then delete it, don’t back it up.

  2. viric 03/25/2012 at 10:50

    Hello,

    for my backup needs, I found no program that could do what I wanted, and wrote mine:
    http://viric.name/cgi-bin/btar

    There I did some kind of analysis similar to this :)