  #16  
Old 03-21-2007, 12:40 AM
sjs
Registered User

Join Date: Feb 2007
Posts: 24
I am happy to report that this method works amazingly well.

Check out the space savings on the destination:

-rw-rw-r-- 1 demo demo 6381662208 Mar 20 21:25 demo.sparseimage
-rwx------ 1 demo demo 9484529664 Mar 17 06:12 demo.sparseimage.orig

Now, as you can see, the only problem is that I needed to
chmod 700 demo.sparseimage, but I would imagine SD can do this in a post-copy script...
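For the permissions fix, a post-copy script would only need one line. A minimal sketch — how SD actually invokes post-flight scripts is an assumption here, so check its documentation for the real interface:

```shell
# Hypothetical post-copy step (SD's script interface is an assumption;
# the point is just restoring owner-only access on the copied image).
IMAGE="demo.sparseimage"
touch "$IMAGE"        # stand-in here for the freshly copied image
chmod 700 "$IMAGE"    # back to -rwx------ like the original
```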

The amazing thing is that the old demo.sparseimage.orig, which is about the same size as my source sparseimage, is so big despite being routinely compacted at Finder logout.

So doing a fresh copy really is the ultimate compaction for a sparseimage. Zero bloat.

SD didn't eliminate any useless files, did it? Like Caches, or things it might not copy if a root copy were in place? I doubt it would do that, but I could do a diff against the two file sets...
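That diff check is easy to sketch. In real use the two paths would be the mountpoints of the two attached images (e.g. under /Volumes after `hdiutil attach`); the throwaway directories below are stand-ins so the commands run as-is:

```shell
# Sketch of the "diff the two file sets" check. ORIG and COPY stand in
# for the mountpoints of the original and copied images.
ORIG=orig_home; COPY=copied_home
mkdir -p "$ORIG/Library/Caches" "$COPY/Library"
echo stale > "$ORIG/Library/Caches/page.cache"
# -r recurses; -q only names files that differ or exist on one side
diff -rq "$ORIG" "$COPY" || true   # diff exits nonzero when trees differ
```

Anything SD skipped would show up as "Only in" lines on the original's side.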

Also, it was quite speedy. Does the algorithm change when copying to local HDs vs. the network? Are you tuned for slower links, like rsync is?

In any event, I am convinced that FileVaults are nothing more than sparseimages. In fact, the method I have been using for over 4 years and 5 computers is to keep moving my sparseimage file from machine to machine: I install OS X, create my user accounts in a particular order to preserve UIDs, enable FileVault on the account, and finally substitute my old sparseimage from the command line. It has been working well for years, and I've been on my original home directory since 10.0. Perhaps there is a better way now that I am using SD; I haven't gotten into all its nooks and crannies yet...