- Stable / Works Every Time
- Different Storage Media
- Offsite / Remote
- Restore Tested
When you are looking for a total backup solution, those are the things you want from it.
Stable / Works Every Time
I read a few reviews of backup tools under Windows. The test was to perform a backup followed by a full restore to a different disk. Then the two disks, source and newly restored, were compared. Any difference was considered a failure. Most of the Windows tools' restored file systems did not match the source file system 100%, often in very important ways.
If any manual steps are needed, eventually, humans will stop doing them. I've been there: I manually did backups every week for almost 6 months. Then every quarter. Then … er … It is human nature to stop. Since I started doing automatic backups, those happen every night. Sure, about 2-5 times a year the backups fail, but the other 360 times, I have a good backup. Automatically.
Different Storage Media
HDDs tend to fail completely, not just in a single partition. Writing to a completely different storage medium is critical. Laptops come with a reinstallation partition, but if the entire HDD is dead, that partition is completely useless. The data, OS, and recovery partition all need to be backed up to different media. Any USB disk or networked disk share works.
If backups take 5 hours, you won’t do them. OTOH, if they require 3-5 minutes, that isn’t too much of a commitment, so you will.
This is about storage efficiency. If you need 30 backups and they required 30x the storage, that would be neither efficient nor practical. There are backup methods and tools that are extremely efficient, storing the full data plus every subsequent change in compressed form. I know this is possible, since I keep 90 days of backups for my personal files from a laptop. The data in my HOME is using 5.06GB of storage and the backup area with 90 days of history is using 6.55GB of storage. To me, that is such a minor amount of storage that it isn't worth not having the backups for 3 months.
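If you want to check that ratio on your own system, two du commands are enough. The backup path below is a placeholder you would point at your own backup area:

```shell
#!/bin/sh
# Compare live data to the space its versioned backups use.
# BACKUP_DIR is a placeholder -- set it to your real backup area;
# it falls back to $HOME only so this sketch runs anywhere.
BACKUP_DIR=${BACKUP_DIR:-$HOME}

du -sh "$HOME"        # size of the live data, e.g. 5.06G
du -sh "$BACKUP_DIR"  # size of all retained increments, e.g. 6.55G
```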
If the backups contain sensitive data of any type, encryption is needed. Encryption is really needed when the backups are over an untrusted network, like the internet, or if the backup media can leave the secured location. That means if you use a portable HDD for your backups, then it really needs to have encrypted storage.
RAID and mirroring (as with rsync) are not good enough for most backups. This is because corruption of the source files immediately corrupts the second copy or mirror. Corruption can happen for many reasons (hardware, controller, software, or logical errors), so having versioned file backups is critical. If you can restore a file from yesterday AND a different file from last week, then your backups are versioned.
Corruption can happen from computer viruses or rootkits or remote crackers too.
Offsite / Remote
Fire, tornado, floods, earthquakes, thieves. Enough said?
Until you test your restore, you have nothing. Backups are like insurance. You use them all the time and hope you never need them, but since hardware fails all the time, each of us probably will need to restore at some point. Hope is not a plan. Having a tested, verified, validated restore process is critical.
Fine. I’m Convinced. What Software Should I Use?
The software that gets you doing backups is the software that you should use. If rsync is the tool used to create a mirror every month, that is better than no backups at all, right? I really like rsync for specific uses, but there are better tools for backups. To me, rsync is an 80% tool. It provides 80% of what most people need in a backup. I think we can do better.
The easiest tools that meet all the best practices listed are Duplicati and Duplicity. Both are F/LOSS tools, with Duplicati being a GUI version of Duplicity. I will admit that I have only played with them and do not use them myself for backups. That does not detract from the capabilities in any way. Duplicati is also cross-platform for Windows, Mac, and Linux, so everyone can use it. Both of these tools store backups inside volume containers, so restores are slightly more complex than with other tools. You must have the software loaded to restore.
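As a sketch of how that looks with Duplicity on the command line (scratch directories stand in for real paths, and the script bails out politely if duplicity is not installed; in real use you would set a PASSPHRASE and drop --no-encryption so backups are GPG-encrypted):

```shell
#!/bin/sh
# Minimal Duplicity round trip on scratch directories.
command -v duplicity >/dev/null 2>&1 || { echo "duplicity not installed"; exit 0; }

SRC=$(mktemp -d)   # stand-in for e.g. $HOME/Documents
DEST=$(mktemp -d)  # stand-in for a USB or network disk
echo "important" > "$SRC/file.txt"

# --no-encryption keeps the demo non-interactive; normally omit it.
duplicity --no-encryption "$SRC" "file://$DEST"

# Restoring requires duplicity, since backups live in volume containers:
RESTORE=$(mktemp -d)/out
duplicity restore --no-encryption "file://$DEST" "$RESTORE"
grep -q important "$RESTORE/file.txt" && echo "restore verified"
```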
My preferred backup tool on Linux is rdiff-backup. It doesn't meet all the best practices alone, but it doesn't prevent us from building a completely best-practice-compliant backup solution ourselves. The commands are nearly identical to rsync, so if you use rsync today, you owe it to yourself to check out rdiff-backup. To restore from the last backup, you do not need the rdiff-backup software; it is just a copy command. I really like that.
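Here is a minimal sketch of that workflow. Scratch directories stand in for real paths, and the script exits quietly if rdiff-backup is not installed:

```shell
#!/bin/sh
# rdiff-backup keeps the newest backup as a plain mirror, plus reverse
# increments under rdiff-backup-data/ for older versions.
command -v rdiff-backup >/dev/null 2>&1 || { echo "rdiff-backup not installed"; exit 0; }

SRC=$(mktemp -d)           # stand-in for e.g. $HOME
DEST=$(mktemp -d)/backup   # stand-in for a different physical disk
echo "version 1" > "$SRC/notes.txt"

rdiff-backup "$SRC" "$DEST"   # first run: full mirror
sleep 1                       # keep the two session timestamps distinct
echo "version 2" > "$SRC/notes.txt"
rdiff-backup "$SRC" "$DEST"   # second run: mirror updated, old version
                              # kept as a compressed reverse increment

# The latest version is an ordinary file -- restoring it is just a copy:
cp "$DEST/notes.txt" "$(mktemp -d)/notes-latest.txt"

# Older versions need the rdiff-backup software, e.g. as of 1 second ago:
rdiff-backup -r 1s "$DEST/notes.txt" "$(mktemp -d)/notes-old.txt"
```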
If you are new to Linux or perhaps you just prefer a GUI, then Back-In-Time is a tool worth checking out. To restore from the last backup, you do not need the back-in-time software. I really like that.
From a completely different perspective, here’s a Windows-centric Best Practices article that also recommends some commercial software.
I’m hardly an expert, but for Windows systems I think there are 2 types of backups needed.
- System Image (OS and installed Apps)
- Data Backups (all user files)
This is because home users need an exact image of the system, thanks to the usual hardware-tied OS-install requirements for Microsoft Windows licenses. Businesses have more flexibility, but home users generally get MS-Windows with the PC they bought, and that license is tied to the specific hardware. It will not run on other hardware. If you have a retail or upgrade version of MS-Windows, this restriction may not apply. I prefer to use Linux, where there aren't any license restrictions on the hardware. Sure, not all hardware is supported by every version of Linux, but at least you know it isn't something specifically coded to prevent use.
For System Images, you can use any tool you like that does bit-for-bit copies. Some examples are:
- dd or any of the safe-dd or rescue-dd tools
- PartImage – my favorite; simple to use and understand
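As a hedged sketch of the dd approach, demonstrated on a scratch file rather than a real device (for an actual system image, if= would be something like /dev/sda, run from a live CD/USB with the disk unmounted):

```shell
#!/bin/sh
# Bit-for-bit copy with dd, plus a checksum to verify the image.
set -e
SRC=$(mktemp)    # stand-in for /dev/sda
IMG=$(mktemp)    # stand-in for /mnt/backup/sda.img
head -c 1048576 /dev/urandom > "$SRC"   # 1 MiB of fake "disk" data

# For a failing disk you would add conv=noerror,sync (or reach for a
# rescue-dd variant); here a plain copy is enough:
dd if="$SRC" of="$IMG" bs=64k 2>/dev/null

# Always verify -- an unchecked image is not a backup:
[ "$(sha256sum < "$SRC")" = "$(sha256sum < "$IMG")" ] && echo "image verified"
```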
These are not incremental, so they require lots of storage for each run. If you create a system restore monthly, I think you are doing pretty good. I do it about once a quarter. Further, I try to only load software using Ninite installation packages, so that application maintenance is easier and reloading software is 1 installer, not 20. A few commercial Windows apps may need to be manually loaded, but that’s much less effort than manually loading 20+ apps, right? If the app didn’t change since your last system image, then there’s no need to load a new version anyway.
I also make system images before and after a service pack install. I’ve been burned before.
Incremental Data Backups
For incremental data backups, I try to push all the data on Windows to a Linux file server and use Linux backup tools. However, I've also used the win32 version of rdiff-backup under Windows. The last version I loaded showed issues with large files (anything over 2GB), but it worked just fine for typical word processing and other similar types of files. The recent reviews of Windows-specific backup software suggest issues there too. Some tools didn't actually back up every file or refused to restore some files, so if you are going this route, please do some testing of both the backup AND the restore capabilities. Reading reviews is a really good idea too.
Get Some Backups Going Today
Even if your selected tool isn't automatic, compressed, storage efficient, or encrypted, having some backups is better than having no backups. Every once in a while I come across someone using ZIP or TAR to create backups. Usually these people like something simple and easy to understand. I do too, but for about the same effort they could use a simple tool that provides so much more, can be automated, and is really efficient. Why wouldn't they do that?
My old backup method was a little cumbersome. To ensure a good backup set, I’d take down the virtual machine, mount the VM storage on the host (Xen), then perform an rdiff-backup of the entire file system, before bringing the VM back up again. This happened daily, automatically, around 3:30am. It has been working for over 3 years with very few hiccups. I’ve had to restore entire VMs and that has worked too. One day I needed to restore the Zimbra system ASAP. From the time I decided to do the restore until end-users could make use of the system was 20 minutes. That’s pretty sweet in my book.
There are some issues with the current setup.
- Backups are performed locally, to a different physical disk before being rsync’ed to the backup server. This is necessary because the backup tool versions are different and incompatible between Ubuntu 8.04 and 10.04 LTS servers.
- Each system is completely shut down for some period of time during the backup process. It is usually 1-4 minutes, but still, that is downtime.
- Most of the systems are still using 8.04 paravirtual machines under Xen. A migration of some type to newer OSes is needed. I should use this opportunity to make things better.
- Some of the systems are running old versions of software which are not up to current patch levels. I guess this happens in all IT shops. None of that is available outside the VPN, so the risks are pretty low.
I think I can do better.
Since this is a technology blog, I figure some of you may be interested in a major change that happened out of necessity here today.
This is the very first blog article on our new physical server, running in a completely different virtual machine. For the next week, everything here is a test.
Due to some sort of outage issue earlier today, I was forced to upgrade everything involved with this blog. I had attempted to perform this upgrade previously and failed. As you can see, this time, there was success. Nobody was shocked more than I.
Below is the 3rd of 6 questions from a reader. I definitely don’t have all the answers, but I’m not short on opinions. ;)
Laurens Duijvesteijn asks:
Q3: I intent (sic) to provide quite a lot of media to my internal network, if I choose for virtualisation, will the VMs be able to access the disk space outside of the container? I do not want to create TB size containers (or should I?). I will probably use the SMB protocol here.
I decided to write this entry after reading an article over at Lifehacker by Whitson Gordon titled What Kind of Maintenance Do I Need to Do on My Windows PC.
What kind of maintenance do I need to do on my Ubuntu/Debian/APT-based PC? Good question. It is pretty simple … for desktops. This article is for APT-based desktop system maintenance, NOT for Linux servers. Linux servers need just a little more love to stay happy. I haven’t used RPM-based distros in many years, so I’m not comfortable providing commands to accomplish the things you need to do, but the methods will be similar.
Let’s get started.
Install System and Application Patches/Updates
This will patch the OS and all your applications.
$ sudo apt-get update && sudo apt-get dist-upgrade
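A couple of follow-up commands keep an APT system tidy. They are shown here in simulate mode (-s) so they are safe to try without root; drop -s and add sudo to actually apply the changes:

```shell
#!/bin/sh
# APT housekeeping after a dist-upgrade, simulated only.
command -v apt-get >/dev/null 2>&1 || { echo "not an APT-based system"; exit 0; }

apt-get -s autoremove   # list packages nothing depends on anymore
apt-get -s autoclean    # list obsolete cached .deb files that would be removed
```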
The folks over at PenDriveLinux have been busy. They have a new version of their multi-boot creation tool for flash drives, YUMI (Your Universal Multiboot Installer). YUMI-0.0.1.7.exe is the current released version, replacing MultibootISO.
The MultibootISO tool never worked for me. I was using unetbootin to load a single ISO onto a single flash drive, but often I've needed gparted, then DBAN, then PARTIMG, then a full Linux like Ubuntu 10.04 or Puppy or TinyCore. With YUMI, you can have all of those on a single flash drive and select which to use at boot time. It seems to work fine.
They finally added an Unknown ISO option so ANY ISO you have with a distro can be added to the boot menus. The boot-up screens are automatically organized nicely by type of tool.
I just placed about 5 ISO files onto a single 2GB flash drive. As I write this, Android-x86 is booting on a netbook. SWEET! I can’t wait to try it out for an hour or so before trying out the new MeeGo x86 release. As long-time readers know, I run Maemo today, so MeeGo would be the next update for that device.
Well, I’ve attempted to boot 3 different OSes.
- MeeGo failed almost immediately.
- Lubuntu displayed the boot screen, asked for a language and eventually failed.
- Android x86 was left to boot for over 30 minutes – the ……………. just kept coming.
The gparted ISO that I specified didn't show up in the boot menu – my ISO differed from the expected version at the third decimal point – mine was newer. I probably should have put it into the Unknown ISO group.
Some Good News
SpinRite did work perfectly. It is running now across all the partitions to refresh any lazy bits.
I moved the gparted ISO into the Unknown ISO group. Hopefully, it will work better there.
Many of us backup important data to optical disks like CDROM or DVD media. Over time, that media is known to fail. This means that every 5-10 years, a plan to migrate all the critical data to newer media needs to be included. It also means that when data is stored to this type of media, steps should be taken to protect the data. Recently, I had a need to pull some data, old family movies, from a DVD. The movies were stored as xvid/mp3 data inside an AVI container. Anyway, after loading the disk onto a network drive, the movie began playing, then abruptly stopped about 2 minutes into the hour long movie. I have other copies on other media … somewhere, but this would be a good opportunity to try a contingency plan that I’ve been using for at least 10 years.
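One common way to add that protection before burning (not necessarily the exact contingency plan mentioned above, which isn't named here) is parity data from the par2 tool, which can rebuild files after parts of the media go bad. A sketch on a scratch file, assuming par2 is installed:

```shell
#!/bin/sh
# Create ~10% parity data for files headed to optical media; guarded so
# the sketch degrades gracefully without par2.
command -v par2 >/dev/null 2>&1 || { echo "par2 not installed"; exit 0; }

DIR=$(mktemp -d)
head -c 262144 /dev/urandom > "$DIR/movie.avi"   # stand-in for real data
cd "$DIR"

par2 create -r10 movie.par2 movie.avi   # burn the .par2 files alongside the data

# Years later, after the media develops bad spots:
par2 verify movie.par2                  # check integrity against the parity data
par2 repair movie.par2                  # rebuild damaged files if enough parity survives
```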
Today I wanted to add another OS to a netbook, an Asus Eee. My common practice is to boot a gparted ISO from a USB flash drive, move some data and partitions around and add a new logical partition to the end of the extended partition space. Write everything back out to disk. Then I’d boot the install disk/ISO and install to that newly created partition. Life was good, usually.
Today, I was greeted with gparted showing unallocated for the entire drive, all 160GB – unallocated. Ouch. This is the first time I’ve had partition table issues, ever, in over 20 yrs.
Ok, not really 101 uses for a Password Manager, but many more than you thought, about 30.
Use A Password Manager
For the last few years, I've been trying to get anyone with more than 5 passwords to remember to start using a password manager (PM) as part of increasing desktop security. Below I'll go into a few alternate uses for that password database beyond just storing computer and website passwords.