BIF
Minister of Gerbil Affairs
Posts: 2458
Joined: Tue May 25, 2004 7:41 pm

Re: Large Hard Drive Defragging

Sun Feb 03, 2013 12:41 am

Well, damn! No wonder I keep breaking screwdrivers and bending nails! :lol:

But I jest. My strategy works for me; I just had to find a way around the problems first. Staggered backups with varying frequencies seem to have done the trick. :) You're right, the "best practices" method would be to run defrags right before full backups. But Diskeeper advertises that it's best to just leave it on 24/7 so that it can move things around even very soon after they were written in a fragmented state. That's a relatively new feature, and I like the possibility that Diskeeper could defrag a file even before it gets backed up for the first time after being created or updated.
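
If I ever did want the by-the-book ordering, it would be easy enough to script. A rough Python sketch (defrag.exe and its /U and /V switches ship with Windows, but the reflect.exe path and switches below are just placeholders for whatever Macrium's CLI actually expects; check its docs):

Code: Select all
import subprocess
import sys

DRIVE = "C:"

# Placeholder Macrium Reflect command line. The path and switches
# here are assumptions; check your install's CLI documentation.
BACKUP_CMD = [r"C:\Program Files\Macrium\Reflect\reflect.exe",
              "-e", "-w", r"C:\backups\full_backup.xml"]

def run(cmd):
    print("Running:", " ".join(cmd))
    return subprocess.run(cmd).returncode

# Defragment first (defrag.exe ships with Windows; /U shows progress,
# /V is verbose; needs an elevated prompt), and only kick off the
# full backup if the defrag pass succeeded.
if run(["defrag", DRIVE, "/U", "/V"]) != 0:
    sys.exit("defrag failed; skipping backup")
sys.exit(run(BACKUP_CMD))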

Since my new system is so much more powerful than anything before it, there's no performance reason to throttle Diskeeper. I like the simplicity of leaving it on!

I'm not using any databases or other journaling software on this system (yet), so a recovery situation (for now) can tolerate restoring each partition to a different point in time.

The current setup will probably suffice for years to come. And when I install a database (which may be later this year, because I have a project in mind), I will probably have to revisit this strategy with respect to the database files and transaction logs. I may put them on my office data partition and exclude them from Macrium's backups, then use the batch scheduler to have the database do its own backups for correct handling of locking and concurrency.
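
Python's built-in sqlite3 online-backup API shows the idea of letting the database copy itself under its own locks (paths here are made up, and a real server database would use its own BACKUP command instead):

Code: Select all
import sqlite3
from datetime import datetime

SRC_DB = r"D:\office\project.db"                     # made-up paths
DEST = r"D:\db_backups\project-" + \
       datetime.now().strftime("%Y%m%d-%H%M%S") + ".db"

# The backup API copies pages under the engine's own locking, so the
# copy stays consistent even if something is writing at the time.
src = sqlite3.connect(SRC_DB)
dst = sqlite3.connect(DEST)
with dst:
    src.backup(dst)
dst.close()
src.close()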

Thanks guys; I learned something today!
 
just brew it!
Administrator
Posts: 54500
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: Large Hard Drive Defragging

Sun Feb 03, 2013 12:56 am

BIF wrote:
But I jest. My strategy works for me; I just had to find a way around the problems first. Staggered backups with varying frequencies seem to have done the trick. :) You're right, the "best practices" method would be to run defrags right before full backups. But Diskeeper advertises that it's best to just leave it on 24/7 so that it can move things around even very soon after they were written in a fragmented state. That's a relatively new feature, and I like the possibility that Diskeeper could defrag a file even before it gets backed up for the first time after being created or updated.

OTOH, if you use a file-based backup solution, fragmentation becomes irrelevant. Restore a file-based backup to a fresh drive, and it will actually be *less* fragmented than an image backup/restore of a defragmented drive.
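
To see why, note that a file-based restore just rewrites every file as one sequential stream, so a fresh filesystem gets to lay each one out contiguously. A toy Python sketch of what the restore is effectively doing (paths made up):

Code: Select all
import os
import shutil

SRC = r"E:\restore_source"    # made-up paths, for illustration
DEST = r"C:\restored"

# Walk the tree and rewrite each file as a single sequential stream;
# a fresh NTFS volume can then allocate every file contiguously,
# unlike a block-for-block image that replays the old layout.
for root, dirs, files in os.walk(SRC):
    target = os.path.join(DEST, os.path.relpath(root, SRC))
    os.makedirs(target, exist_ok=True)
    for name in files:
        shutil.copy2(os.path.join(root, name),
                     os.path.join(target, name))  # data + timestamps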

BIF wrote:
Since my new system is so much more powerful than anything before it, there's no performance reason to throttle Diskeeper. I like the simplicity of leaving it on!

To each his own, I guess. I do not like regularly scheduled defrags, since they result in extra wear and tear on the drive(s) and also carry a small risk of data corruption (especially if you are not using ECC RAM and/or do not have a UPS).

BIF wrote:
I'm not using any databases or other journaling software on this system (yet), so a recovery situation (for now) can tolerate restoring each partition to a different point in time.

All modern file systems (NTFS, EXT4, etc.) use journals internally, and can write updated data to blocks other than the ones which were occupied by the original file. This can result in extra blocks getting backed up by a block-based incremental backup tool.
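
To make that concrete, a block-based incremental is essentially doing something like this (toy Python sketch over two disk-image files with made-up names): any block whose hash changed since the last run gets backed up, including blocks that only changed because the filesystem relocated data.

Code: Select all
import hashlib

BLOCK_SIZE = 4096  # typical cluster size

def block_hashes(path):
    # Hash the image block by block, the way a block-based
    # incremental tool tracks what changed between runs.
    hashes = []
    with open(path, "rb") as f:
        while block := f.read(BLOCK_SIZE):
            hashes.append(hashlib.sha256(block).hexdigest())
    return hashes

before = block_hashes("disk_before.img")   # made-up image files,
after = block_hashes("disk_after.img")     # assumed the same size

# Blocks that merely moved hash differently even though the file
# contents are "the same", so they all land in the incremental.
changed = [i for i, (a, b) in enumerate(zip(before, after)) if a != b]
print(len(changed), "blocks to back up")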

BIF wrote:
The current setup will probably suffice for years to come. And when I install a database (which may be later this year, because I have a project in mind), I will probably have to revisit this strategy with respect to the database files and transaction logs. I may put them on my office data partition and exclude them from Macrium's backups, then use the batch scheduler to have the database do its own backups for correct handling of locking and concurrency.

Yeah, databases (especially if large) won't play nice with file-based incremental backups either. In fact, for databases you may be better off with the block-based incremental, provided you disable the defrag.
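
(For a sense of scale, with made-up numbers: a 50 GB database file that sees 100 MB of changed pages per day means a file-based incremental re-copies the whole 50 GB, while a block-based incremental picks up only the roughly 100 MB of blocks that actually changed, provided a defragger isn't shuffling the rest of them around underneath it.)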

BIF wrote:
Thanks guys; I learned something today!

You're welcome... :D
Nostalgia isn't what it used to be.
