  #7  
Old 05-08-2006, 01:44 AM
chemokid
Quote:
Originally Posted by dnanian
OK, for this you'd:

- On that same day, either delete the corresponding daily image with an "after copy" script or schedule another script to do a smart update of a daily image. I'd suggest using "Backup - user files" instead of "Backup - all files", since it sounds like that's more like what you're trying to protect.
- Create four more scheduled copies, set for every day on those same weeks, to their own image files (removed above), with the same settings except "Copy Different" rather than "Smart Update". This assumes the daily backup uses the same file.

I think this'll do what you're looking to do. Personally, I think it's a bit over the top, but you're a better judge of what you want...
Hi, just adding my backup plan here as well, since I'm also setting up weekly backups of user files, except I don't need any daily incremental updates. I'm using the "Backing up over a network" chapter of the PDF documentation as a reference.

1. To start, create read/write sparse disk images on a network share, named week1, week2, week3, and week4. For each image, select Smart Update from the "During copy" menu.

This should produce four identical images after the first day of running SuperDuper!
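If you'd rather pre-create the four sparse images from the command line instead of letting SuperDuper! build them on first run, the sketch below is one way to do it with `hdiutil`. The share mount point (`/Volumes/BackupShare`), image size, and filesystem are assumptions; adjust them to your setup. To stay on the safe side it only prints the commands for review rather than running them:

```shell
#!/bin/sh
# Hypothetical mount point for the network share -- adjust to your setup.
SHARE="/Volumes/BackupShare"

# Print (rather than run) one "hdiutil create" command per weekly image,
# so the commands can be reviewed before actually executing them.
gen_create_commands() {
    for week in week1 week2 week3 week4; do
        # -type SPARSE makes a read/write sparse image; -size is only
        # the ceiling the image may grow to, not space used up front.
        echo "hdiutil create -size 100g -type SPARSE -fs HFS+ -volname $week $SHARE/$week.sparseimage"
    done
}

gen_create_commands
```

Once the printed commands look right, they can be piped to `sh` or run by hand; sparse images grow on demand, so four of them don't consume four times the backup size immediately.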

2. Then, on the last day of each week, select that week's image and run a Smart Update; on the last day of the following week, run a Smart Update on the next week's image.

This should produce rolling weekly images, correct? Along with this schedule, I would compact the images at the end of each month using the tips mentioned here:

http://www.shirt-pocket.com/forums/s...ead.php?t=1115
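The monthly compaction step could also be scripted. `hdiutil compact` reclaims space that has been freed inside a sparse image, but the image must be unmounted when it runs. As above, the share path and image names here are assumptions, and the script only prints the commands for review:

```shell
#!/bin/sh
# Hypothetical location of the weekly images -- adjust to your setup.
SHARE="/Volumes/BackupShare"

# Print one "hdiutil compact" command per weekly image so the commands
# can be reviewed (and the images unmounted) before running them.
gen_compact_commands() {
    for week in week1 week2 week3 week4; do
        # "hdiutil compact" shrinks a sparse image by releasing bands
        # no longer used by the filesystem inside it.
        echo "hdiutil compact $SHARE/$week.sparseimage"
    done
}

gen_compact_commands
```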

Am I missing any steps here that I should add or change?