Murphy was right. If something can go wrong, it will... eventually. Earlier this week, the water main connected to our house burst, submerging much of the laundry room in inches of water. The house's main shut-off valve was little help, since the break was between the house and the main line from the city, so I spent several hours bailing until the water could be turned off completely.
The laundry room sits just behind my office and, thankfully, a little bit below it. The water level didn't rise high enough to trickle into the Benchmarking Sweatshop. It did soak a few older motherboards and graphics cards, though; the laundry room also doubles as storage for the endless stream of hardware that FedEx delivers to my door.
Fortunately, the damage appears to be minimal. It certainly could have been worse. Not that long ago, my file server was sitting in what became the flood zone. But it, too, suffered a spectacular failure. While I was visiting family over the Christmas holidays, two of the three drives in the system's RAID 5 array died. A sagging 5V rail in the PSU was to blame, and my 2TB array was toast. I probably should have been keeping a closer eye on the system, but home file servers are the sort of thing one stuffs into a closet and kind of forgets about. This one had been running for years without so much as a hiccup.
My initial response was panic. Two terabytes of data was gone: high-bitrate MP3s ripped carefully from my collection of over 500 CDs, countless digital photos, priceless home movies, a decade's worth of TR-related files, and a healthy helping of, er, Linux ISOs that would take forever to grab off BitTorrent again.
Wait, I have backups!
A couple months earlier, I'd backed up the entire file server to a single 2TB hard drive that was sitting on the shelf in my office. It wasn't completely up to date, but almost everything that was missing was sitting on other machines. My desktop has had its contents protected by a RAID 1 array for years, and it's a dumping ground for most of my data. A fairly recent version of the essential stuff is also kept on my laptop and on the USB key on my keyring. There's an old notebook drive sitting at my parents' house loaded with my most critical data, too.
In the end, I lost only a couple of days' worth of benchmark data and a few frantic hours. But then I've always been pretty good about keeping things backed up. It all started with my high-school computer lab teacher, who would randomly turn off entire banks of machines to make sure we saved our work regularly. Thanks for the compulsive Ctrl+S tic, Mr. Knowles.
For years, my data was protected by a mix of RAID 1 on the desktop and a closet file server with its own array. Scheduled DOS batch files copied gigabytes from my desktop nightly, and the server was backed up to a separate hard drive periodically. When I moved to Windows 7, the batch files were replaced with the OS's built-in backup routine, which has the handy ability to create an entire system image instead of just saving a selection of files.
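A nightly job of that sort doesn't have to be fancy. The sketch below shows roughly what such a batch file might look like when scheduled with the Windows Task Scheduler; the paths and share names are made up for illustration, not copied from my actual script.

```batch
:: nightly-backup.bat -- illustrative sketch; paths and share names are hypothetical.
:: robocopy ships with Vista and Windows 7; older XP-era scripts would have
:: used something like "xcopy /d /e /y" to copy only newer files instead.

:: /MIR mirrors the source tree to the destination (copies changes, prunes deletions)
:: /R:2 /W:5 retries failed copies twice, waiting five seconds between attempts
robocopy C:\Users\Me\Documents \\fileserver\backup\Documents /MIR /R:2 /W:5 /LOG:C:\Logs\nightly.log
robocopy C:\Users\Me\Pictures  \\fileserver\backup\Pictures  /MIR /R:2 /W:5 /LOG+:C:\Logs\nightly.log
```

The /LOG+ switch on the second pass appends to the same log file, so a quick skim the next morning is enough to confirm the copy actually ran.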
Although I've considered resurrecting my file server, the home-theater PC in the living room has been filling in admirably. I tossed in an extra low-RPM hard drive, which doesn't add much noise, and I could even do RAID if this turns into a permanent solution. It likely will, if only because that will save me the trouble of putting together—and monitoring—a new box. Plus, most of my home storage is media, which makes sense to have in the living room.
Windows can shuffle files between systems on a home network easily enough, but getting them onto an external drive is an extra step. It's also an additional backup job on top of my nightly network copy. This creates problems for Windows 7, whose backup routine doesn't support multiple jobs.
There are, of course, numerous external hard drives that come with their own backup software. Thing is, an external drive is really no safer than the secondary drive in my HTPC, which at least sits in a different room than my desktop. To be truly secure, data really needs to be duplicated at an off-site location. Doing that manually takes actual effort.
Fortunately, cloud-based storage has become a viable solution... provided you're willing to trust someone else with your data. Even if you're not, files can be encrypted beforehand and uploaded once scrambled. Free options abound, with Dropbox, SkyDrive, and now Google Drive offering gigabytes of remote storage. None of those services have enough free capacity to meet my needs, though. Ideally, I need hundreds of gigabytes to keep all my precious data safe.
While I'm loath to shell out for online storage when I have terabytes of disk capacity sitting idle in my lab, I'm also realistic about how often off-site backups happen around here—and how many close calls I've had in the last few months. It's worthwhile for me to pay to have software take care of the problem. Since our resident developer likes CrashPlan so much, I gave the free trial a shot. After a month of it sitting unobtrusively in my system tray, silently backing up files without me even noticing its presence, I sprang for two years of unlimited storage for $90. That's less than the cost of the average terabyte hard drive, and it comes with a lot more peace of mind.
I'm not really worried about someone hax0ring my CrashPlan account and digging through my data, but it's nice to know that the service has strong encryption and the ability to set a private password that even the company's techs won't know. The CrashPlan app allows local backups, too, but there's no native support for networked shares, which is a little annoying. I'm more interested in CrashPlan's ability to use other computers as backup sites. The app needs to be running on the target systems, but users can create their own cloud to complement—or supplant—CrashPlan's own servers. This feature is included in the free version, although it's limited to once-a-day backups rather than the real-time approach taken by the full-blown product.
Once my initial dump has finished uploading to CrashPlan's servers, I'll have three layers of protection, all fully automated. I'll sleep better at night knowing I'm no longer the weak link for my off-site backups. But I'll also keep refreshing my USB key and filling the occasional backup drive because, hey, you never know. If Gmail can go down, surely CrashPlan can, too.