Shirt Pocket Discussions  
  #1  
Old 12-19-2016, 01:13 PM
Dan Lester is offline
Registered User
 
Join Date: Nov 2008
Posts: 173
question about "bit rot"

I've been thinking about hard disks as long-term storage, and I've realized that the magnetic domains on a hard disk aren't actually permanent. The field strength decreases by more than 1% per year, so even on a hard disk that is never used, your data will gradually fade away. Errors will accumulate on time scales of years. This is well understood.

As far as I can tell, the only way to prevent this is to completely rewrite your hard disk data once in a while. Every year or two? I can find very little cogent information about mitigation strategies.

That makes me wonder about SuperDuper Smart Updates, which only write things that have changed. Things that haven't changed don't get rewritten, so after a while, much of the data on your archive disk can get stale. Is this a concern? That is, is it smart to do a full backup every once in a while instead of a Smart Update? Some expert advice would be handy.
  #2  
Old 12-19-2016, 01:38 PM
dnanian is offline
Administrator

Join Date: Apr 2001
Location: Weston, MA
Posts: 14,923
I take a major OS update as an opportunity to do a full backup, Dan. On top of that, I use a number of backup volumes, some of which are RAID and get scrubbed, to ensure that things are generally safe. Plus, I use online backup... your best approach is thorough, diverse coverage.
__________________
--Dave Nanian
  #3  
Old 12-21-2016, 09:44 AM
Dan Lester is offline
Registered User
 
Join Date: Nov 2008
Posts: 173
Quote:
Originally Posted by dnanian View Post
I take a major OS update as an opportunity to do a full backup, Dan. On top of that, I use a number of backup volumes, some of which are RAID and get scrubbed, to ensure that things are generally safe. Plus, I use online backup... your best approach is thorough, diverse coverage.
I certainly agree that diverse coverage is good. But if you have a few archive disks that you only do incremental SuperDuper backups on, there is a good chance that after a decade none of them will work.

Are you aware of any Mac utilities that do disk refreshing? I think there used to be one, called DiskRefresher, but it's no longer available. There are PC utilities that do this. Many years ago, there was a post about doing disk refreshes on a Unix-based system ...

https://larryjordan.com/articles/tec...-disk-storage/

but I frankly think those techniques don't really do what he says they do.

It would be nice to see some well-thought-out strategy for long term archiving on hard disks. I realize that's not necessarily the purpose of SuperDuper, but your insights would be welcome.
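In the absence of such a utility, I suppose one could script a crude refresh pass oneself: read every file completely and rewrite it, so the drive lays the bits down fresh. A toy Python sketch of the idea (purely illustrative; refresh_file and refresh_tree are names I made up, and the temp file is staged on the same volume so the final swap is atomic):

```python
import os
import shutil
import tempfile

def refresh_file(path):
    """Read a file completely and rewrite it, so the drive re-records
    the magnetic domains that store it."""
    dir_name = os.path.dirname(path) or "."
    # Stage the fresh copy on the same volume, then atomically swap it in.
    fd, tmp_path = tempfile.mkstemp(dir=dir_name)
    try:
        with os.fdopen(fd, "wb") as dst, open(path, "rb") as src:
            shutil.copyfileobj(src, dst)
        os.replace(tmp_path, path)
    except Exception:
        os.remove(tmp_path)
        raise

def refresh_tree(root):
    """Refresh every file under root."""
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            refresh_file(os.path.join(dirpath, name))
```

(Note this resets each file's modification date, which would confuse a later Smart Update; a real tool would preserve metadata.)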
  #4  
Old 12-21-2016, 10:15 AM
dnanian is offline
Administrator

Join Date: Apr 2001
Location: Weston, MA
Posts: 14,923
Well, again, the scrubbing does refresh the data (as, obviously, does the yearly rewriting). So, if you consider your NAS-based backups "archival" and have them scrubbed, they seem reasonably safe to me.

As far as there being a good chance that after a decade none will work - I've been surprised that most drives *do* work just fine after a decade, retaining all their data. Which, of course, is no guarantee.
__________________
--Dave Nanian
  #5  
Old 12-22-2016, 09:47 AM
Dan Lester is offline
Registered User
 
Join Date: Nov 2008
Posts: 173
Quote:
Originally Posted by dnanian View Post
Well, again, the scrubbing does refresh the data (as, obviously, does the yearly rewriting). So, if you consider your NAS-based backups "archival" and have them scrubbed, they seem reasonably safe to me.

As far as there being a good chance that after a decade none will work - I've been surprised that most drives *do* work just fine after a decade, retaining all their data. Which, of course, is no guarantee.
Thank you. But I've always thought of "scrubbing" as the simple hard erasure of a disk for security. What kind of scrubbing refreshes data? Is there an app that does that?

And yes, while MTBF of hard disks is supposed to be about five years, they've always lasted much longer than that for me, at least for mechanical performance.
  #6  
Old 12-22-2016, 09:57 AM
dnanian is offline
Administrator

Join Date: Apr 2001
Location: Weston, MA
Posts: 14,923
No, that's not what this kind of scrubbing does: your NAS will usually offer data integrity scrubbing.

And I'm talking about more than mechanical performance. There's a lot of "fuzziness" in magnetic drop-off, hence the ability to recover data from master tapes that are 50 years old.
__________________
--Dave Nanian
  #7  
Old 12-23-2016, 11:45 AM
Dan Lester is offline
Registered User
 
Join Date: Nov 2008
Posts: 173
Hmmm. So I guess that means that, without plunking down money for a full-up NAS system, the only option for long-term HD archiving is regular (though perhaps infrequent) wholesale rewrites of the data. It seems a little odd that there isn't some Mac app that would do this refreshing more conveniently, and that ideally could even be scheduled.
  #8  
Old 12-23-2016, 11:47 AM
dnanian is offline
Administrator

Join Date: Apr 2001
Location: Weston, MA
Posts: 14,923
Doesn't seem that odd to me... this type of failure just isn't common (and who's to say the source isn't failing - what's the reference?)...
__________________
--Dave Nanian
  #9  
Old 12-23-2016, 10:40 PM
Dan Lester is offline
Registered User
 
Join Date: Nov 2008
Posts: 173
Quote:
Originally Posted by dnanian View Post
Doesn't seem that odd to me... this type of failure just isn't common (and who's to say the source isn't failing - what's the reference?)...
If this type of failure isn't considered a threat, then I have to wonder why NAS systems feature it, and people pay money for it.

I suspect it isn't common at least partly because continuously powered-up drives suffer mechanical failures long before bit rot can even happen. But for disks that are on-the-shelf, unpowered archives, bit rot is likely to be more of an issue.

A huge amount of effort has gone into long-term archiving strategies for petabytes of data, at least among heavy producers of science data. I understand it ain't cheap. It would be interesting to see some strategic thinking that would apply to a home user. Again, it may just be a matter of archiving to multiple drives and then, after checksum verification on one, doing a wholesale rewrite of the others every few years. If so, a tool to do that would be handy.
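The checksum bookkeeping, at least, is easy enough to sketch in Python (a toy illustration only; build_manifest and verify_manifest are names I invented, and a real tool would also need to handle files that were added or deleted since the manifest was made):

```python
import hashlib
import os

def file_sha256(path, chunk_size=1 << 20):
    """SHA-256 of a file, read in chunks so huge files don't need RAM."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def build_manifest(root):
    """Map each file's path (relative to root) to its SHA-256 digest."""
    manifest = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            manifest[os.path.relpath(full, root)] = file_sha256(full)
    return manifest

def verify_manifest(root, manifest):
    """Return the relative paths whose contents no longer match."""
    return [rel for rel, digest in sorted(manifest.items())
            if file_sha256(os.path.join(root, rel)) != digest]
```

Build a manifest when you first write the archive, store it with the archive, and any non-empty result from verify_manifest years later tells you exactly which files have rotted.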
  #10  
Old 12-24-2016, 07:18 AM
dnanian is offline
Administrator

Join Date: Apr 2001
Location: Weston, MA
Posts: 14,923
Perhaps a NAS is the way to go for you, Dan? They aren't that expensive, are easy to manage, and have other means of preserving and reconstructing data as well...
__________________
--Dave Nanian
  #11  
Old 12-24-2016, 10:02 PM
Dan Lester is offline
Registered User
 
Join Date: Nov 2008
Posts: 173
Thanks, Dave. I'll think about it. But it sure would be nice to see some technical report or something on options for home users to best preserve digital data. Of course, it's been noted that the lifetime of digital formats isn't that long, so if you write to an archive with a hundred year lifetime, in fifty years no one will be able to read it. I guess the default plan has to be just to make a lot of copies onto a lot of archives.

Is there some way to check a disk for errors before copying? Is that what "First Aid" does? Does SuperDuper do this? Might be smart to make sure there are no rotten bits before one starts to copy. I guess this would be a matter of scanning the whole disk for checksum errors.
  #12  
Old 12-24-2016, 10:24 PM
dnanian is offline
Administrator

Join Date: Apr 2001
Location: Weston, MA
Posts: 14,923
Disk Utility checks the directory for errors; it doesn't check the "disk" as such... this is generally the area where ZFS does well, since it's self-repairing... but ZFS isn't terribly easy to use on OS X/macOS.
__________________
--Dave Nanian
  #13  
Old 12-25-2016, 03:30 PM
Dan Lester is offline
Registered User
 
Join Date: Nov 2008
Posts: 173
Well, FWIW, I did find this interesting essay ...

https://www.osomac.com/2014/01/13/bit-rot/

which considers what a Mac user can do about bit rot. The answer? Not a helluva lot, it seems. Now, copying to several independent archive disks doesn't really help much, because if you've got a rotten bit on your home machine, I assume you're just going to faithfully copy that rotten bit to your archives.
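The one thing multiple independent copies would buy you is the ability to out-vote a rotten bit: checksum the same file on each archive and trust whichever digest a majority agrees on. A toy Python sketch of just the voting step (majority_digest is a name I made up):

```python
from collections import Counter

def majority_digest(digests):
    """Given the digests of the same file from several independent
    archive copies, return the digest a strict majority agrees on,
    or None if the copies have rotted past the point of consensus."""
    best, count = Counter(digests).most_common(1)[0]
    return best if count * 2 > len(digests) else None
```

With three archive disks, one rotten copy is outvoted by the other two, and the disagreeing disk is the one that needs rewriting. Of course this only works if the rot happened after the copies diverged, not on the source before copying.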
  #14  
Old 12-26-2016, 12:29 PM
dnanian is offline
Administrator

Join Date: Apr 2001
Location: Weston, MA
Posts: 14,923
Again, I hate to be a broken record here, but using a NAS to archive (Synology units, for example, use Btrfs), with hardware and "bit rot" redundancy, seems like the way to go here, not a tool to run checksums on your drive.
__________________
--Dave Nanian
  #15  
Old 12-26-2016, 07:41 PM
Dan Lester is offline
Registered User
 
Join Date: Nov 2008
Posts: 173
Quote:
Originally Posted by dnanian View Post
Again, I hate to be a broken record here, but using a NAS to archive (Synology units, for example, use Btrfs), with hardware and "bit rot" redundancy, seems like the way to go here, not a tool to run checksums on your drive.
Thank you. You're exactly right. But NAS systems will set you back a few hundred bucks. I was just wondering if there could be a lower cost strategy for the non-business user that would ensure archival security for unchanging data. Ideally just an app that works on consumer-level drives.

For a few hundred bucks, I could just burn half a terabyte to M-DISCs.