Well, damn! No wonder I keep breaking screwdrivers and bending nails!
But I jest. My strategy works for me; I just had to find a way around the problems first. Staggered backups with varying frequencies seem to have done the trick. You're right, the "best practices" method would be to run defrags right before full backups. But Diskeeper advertises that it's best to just leave it on 24/7 so that it can move things around even very soon after they were written in a fragmented state. That's a relatively new feature, and I like the possibility that Diskeeper could defrag a file even before it gets backed up for the first time after being created or updated.
Since my new system is so much more powerful than anything before it, there's no performance reason to throttle Diskeeper. I like the simplicity of leaving it on!
I'm not using any databases or other journaling software on this system (yet), so a recovery situation (for now) will tolerate restoring each partition to a different point in time.
The current setup will probably suffice for years to come. And when I install a database (which may be later this year, because I have a project in mind), I will probably have to revisit this strategy with respect to the database files and transaction logs. I may put them on my office data partition, exclude them from Macrium's backups, and then use the batch scheduler to have the database do its own backups, so that locking and concurrency are handled correctly.
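For what it's worth, that exclusion-plus-native-backup idea could be sketched as a single Windows Task Scheduler registration. Everything here is hypothetical (the task name, database name, paths, and the assumption of SQL Server with its sqlcmd tool); it's just an illustration of letting the database engine, rather than Macrium, produce the consistent copy:

```shell
REM Hypothetical sketch: schedule a nightly native backup so the database
REM engine handles its own locking and concurrency. All names are made up.
schtasks /Create /TN "NightlyDbBackup" /SC DAILY /ST 02:00 ^
  /TR "sqlcmd -S localhost -E -Q \"BACKUP DATABASE ProjectDb TO DISK='D:\OfficeData\DbBackups\ProjectDb.bak' WITH INIT\""
```

The nice side effect is that the .bak file the database writes is a closed, internally consistent file, so the regular Macrium image can safely sweep it up on its next run even though the live database files themselves are excluded.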
Thanks guys; I learned something today!