Need a reliable backup solution

Posted in Mac Software, edited January 2014
Hi, all. I'm a big fan of .Mac, but Backup 2.1 doesn't do it for me. It's flat-out unreliable. I've tried all sorts of hocus-pocus to get it to do automatic backups, but it only works intermittently. That's not good enough. So I'm in need of a reliable backup program. One I can trust to do what it needs to do.



I've tried CCC but it seems that development has stopped and the version of psync that works with Panther isn't entirely compatible with it. What I'd like is incremental backups of my entire hard drive. Failing something like that, I'd like at least schedulable backups to my external HD. Any recommendations? TIA.

Comments

  • Reply 1 of 19
    gabid Posts: 477member
    Quote:

    Originally posted by torifile

    Hi, all. I'm a big fan of .Mac, but Backup 2.1 doesn't do it for me. It's flat-out unreliable. I've tried all sorts of hocus-pocus to get it to do automatic backups, but it only works intermittently. That's not good enough. So I'm in need of a reliable backup program. One I can trust to do what it needs to do.



    I've tried CCC but it seems that development has stopped and the version of psync that works with Panther isn't entirely compatible with it. What I'd like is incremental backups of my entire hard drive. Failing something like that, I'd like at least schedulable backups to my external HD. Any recommendations? TIA.




    Sorry I don't have any suggestions (I'm looking forward to what others have to say, though), but I'm wondering what specific problems you've been having with Backup. What media have you been trying to back up to? I've been using Backup to automatically save to my iDisk and it works without a hitch every night. Though if there is some media that Backup doesn't play nice with, I'd very much like to know, because I'll be having to do an even greater amount of backing up shortly (re: me actually sitting down to pen my dissertation).
  • Reply 2 of 19
    torifile Posts: 4,024member
    Quote:

    Originally posted by Gabid

    Sorry I don't have any suggestions (I'm looking forward to what others have to say, though), but I'm wondering what specific problems you've been having with Backup. What media have you been trying to back up to? I've been using Backup to automatically save to my iDisk and it works without a hitch every night. Though if there is some media that Backup doesn't play nice with, I'd very much like to know, because I'll be having to do an even greater amount of backing up shortly (re: me actually sitting down to pen my dissertation).



    When Backup 2.1 first came out, it seemed to do a good job backing up every night. That was just to another partition on my internal hard drive. I've since gotten an external HD, and it just won't back up automatically unless I manually set it for a later time. And it won't do it more than once. I've got to open it and change the schedule for it to work. :/



    I'm using a Maxtor drive in a Macally FireWire enclosure. If I back up manually, it works without a hitch, but that's not good enough. I, too, am working on a dissertation and would HATE to lose even a single word of it. It's that good. Just kidding, but I don't want to have to deal with data loss if I can prevent it.
  • Reply 3 of 19
    gabid Posts: 477member
    Quote:

    Originally posted by torifile

    I, too, am working on a dissertation and would HATE to lose even a single word of it. It's that good.



    Isn't that what we all say?



    Seriously though, thanks for the heads up. Good info to know, since the chances of me acquiring a second HD of some sort are outside the realm of possibility. Now I'm waiting for suggestions just like you are...
  • Reply 4 of 19
    kickaha Posts: 8,760member
    rsync is your buddy.



    Get the RsyncX distribution from http://www.macosxlabs.org/rsyncx/rsyncx.html and install the puppy. The latest version has all *SORTS* of neat things in it (like incremental backups that look like full copies, but only take up delta space + 1 copy...). It can be a PITA to set up if you're not comfy with the command line at all, but their latest GUI tool isn't too bad, if it does look like the dashboard of a 777. (rsync just really does have that many switches and options.)



    This AI mod trusts his dissertation to it, shouldn't you?
  • Reply 5 of 19
    torifile Posts: 4,024member
    Quote:

    Originally posted by Kickaha

    rsync is your buddy.



    Get the RsyncX distribution from http://www.macosxlabs.org/rsyncx/rsyncx.html and install the puppy. The latest version has all *SORTS* of neat things in it (like incremental backups that look like full copies, but only take up delta space + 1 copy...). It can be a PITA to set up if you're not comfy with the command line at all, but their latest GUI tool isn't too bad, if it does look like the dashboard of a 777. (rsync just really does have that many switches and options.)



    This AI mod trusts his dissertation to it, shouldn't you?




    I knew you'd be chiming in soon enough. I've got it and I'm doing my first run of it now. You're right - it does seem a little complicated. Could you point me in the right direction on how to make a script that will automate backups of my home directory? It seems to be more for synchronizing over a network (my next project - backing up my data in a reliable way, because the text of my dissertation will mean nothing if I don't have data to back it up, pun intended). TIA.
  • Reply 6 of 19
    kickaha Posts: 8,760member
    Well (and I'm starting this conversation right before going to bed, so I may not get back to you until the a.m.), what kind of backing up do you want to do?



    Mirror?



    Incremental deltas?



    Assume you just want mirroring, the simplest option: anything deleted on the source gets deleted off the backup, etc.



    rsync --eahfs --archive --relative /Users/torifile /Volumes/BackupDisk/



    This will create a '/Users/torifile' directory on the BackupDisk if it doesn't already exist (the relative flag), back up everything including permissions, symlinks, etc. (the archive flag), with HFS+ resource forks and metadata (the eahfs flag).



    That's pretty much it.



    Wrap that in your favorite cron job script or use one of the task schedulers out there, and voila.
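    For instance, a crontab entry for a nightly 2am run might look something like this (a sketch, untested - the schedule and the rsync path are just examples; point it at wherever RsyncX installed its binary on your system):

    Code:

    # minute hour day-of-month month weekday command - every night at 2am
    0 2 * * * /usr/local/bin/rsync --eahfs --archive --relative /Users/torifile /Volumes/BackupDisk/

    Run 'crontab -e' to drop a line like that into your personal crontab.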



    --showtogo will give you a readout of how much is left to go (in # of files, *not* bytes or seconds... so if you hit some big files, it'll look like it's bogged down), which I like for runs over network connections.



    The nice thing about this is that if you end up with a server you have access to, the above command is trivially changed to point at the server instead of a mounted disk. Voila. Backup to the server of your choice, from anywhere on the net.



    rsync newbie gotcha alert: do *not* put a trailing slash after /Users/torifile (making it /Users/torifile/) - this is a completely different command. It says 'back up the *contents* of /Users/torifile/' while without the trailing slash it says 'back up the *directory* /Users/torifile'. Subtle, but it can be nasty later when restoring.
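    To make the difference concrete (only the slash changes between these two):

    Code:

    # without the slash: backs up the directory itself -> BackupDisk/torifile/...
    rsync --eahfs --archive /Users/torifile /Volumes/BackupDisk/

    # with the slash: dumps the *contents* straight into BackupDisk/
    rsync --eahfs --archive /Users/torifile/ /Volumes/BackupDisk/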



    Restoring, btw, is easy. But that's for morning.
  • Reply 7 of 19
    spotcatbug
    Retrospect
  • Reply 8 of 19
    kickaha Posts: 8,760member
    Not worth the money, unless you have a server farm.



    1) You can't read your own files without the application.



    2) The UI is nearly as bad as RsyncX's GUI front end.



    3) Dantz has utterly dropped the ball on customer support.



    4) Upgrade? What's an upgrade?



    I used to be a Retrospect user, but there are so many cheap/free options under MacOS X that Retrospect simply no longer makes good sense for the individual user.
  • Reply 9 of 19
    staphbaby Posts: 353member
    Quote:

    Originally posted by spotcatbug

    Retrospect



    Retrospect? *chuckle* *guffaw* [runs screaming into the night]



    Seriously though, I use this every day (Retrospect Workgroup). It's alright, but Dantz have this amazing ability to (1) not fix bugs and (2) charge for the fixes to bugs so big that they completely break functionality (vide the 5.5 -> 6.0 "upgrade" in Panther).



    Another avenue to investigate is Mike Bombich's website (he's the guy who does CCC): he gives extremely detailed instructions on what you need to do on the command line to back up data safely... looking at that might give you a bit more granularity.



    Beyond that: rsync all the way, says I. How can I not? It was invented by a Canberran...
  • Reply 10 of 19
    othello Posts: 1,054member
    I use File Synchronization to back up two laptops to a server, then the server to a Maxtor FireWire drive. Works a charm...



    http://nemesys2.dyndns.org:8080/File...zation_EN.html
  • Reply 11 of 19
    torifile Posts: 4,024member
    Quote:

    Originally posted by Kickaha

    Assume you just want mirroring, the simplest option: anything deleted on the source gets deleted off the backup, etc.



    rsync --eahfs --archive --relative /Users/torifile /Volumes/BackupDisk/



    This will create a '/Users/torifile' directory on the BackupDisk if it doesn't already exist (the relative flag), back up everything including permissions, symlinks, etc. (the archive flag), with HFS+ resource forks and metadata (the eahfs flag).



    That's pretty much it.



    Wrap that in your favorite cron job script or use one of the task schedulers out there, and voila.





    Ok, one would think that I could do this simple thing, but cron scripts have me absolutely baffled. I've tried using CronniX to automate the process, but that didn't seem to work. I'll look up the man pages on cron to see what I need to do, but I may need some more help. (I'll try to figure it out on my own first.)



    In the meantime, could you fill me in on what an "incremental delta" is? TIA.
  • Reply 12 of 19
    kickaha Posts: 8,760member
    An incremental backup is one that only backs up the things that have changed, instead of the whole shebang each time. Saves a lot of time.



    An incremental delta backup lets you keep multiple copies of the backup, say one every 24 hrs, without using many times the storage. Say you have a week's worth of backups. Normally this would take 7 times the space used by one, but a delta setup gives you seven backups that each *look* like a full backup; with a little smoke and mirrors (Unix hardlinks), the amount of space you need turns out to be the original + the size of all the files that changed. That's it.



    'Tis very cool, if you have enough space to do it, and a little bit of familiarity with rsync. (And no, I haven't had a chance to implement this myself yet, so take a peek at the --backup option; I believe that's the new flag to do this internal to rsync, instead of the old ln method outlined on several web pages regarding rsync.)
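    For what it's worth, the flag that does the hardlink trick in recent vanilla rsyncs is --link-dest (added in 2.5.6, so check what your build has). Here's a rough, untested sketch of the rotation idea; the paths and the dated naming scheme are just placeholders:

    Code:

    #!/bin/bash
    # Rotating hardlink snapshots: each day *looks* like a full copy,
    # but unchanged files are hardlinked to yesterday's snapshot, so
    # only changed files take up new space.
    DEST=/Volumes/BackupDisk/snapshots
    TODAY="$DEST/$(date +%Y-%m-%d)"
    YESTERDAY="$DEST/$(date -v-1d +%Y-%m-%d)"   # BSD date syntax

    mkdir -p "$DEST"
    # trailing slash on the source is deliberate here: we want the
    # *contents* of the home directory inside each dated snapshot
    rsync --archive --link-dest="$YESTERDAY" /Users/torifile/ "$TODAY/"

    The first run has no previous snapshot to link against, so rsync just warns and does a plain full copy; after that, each run only spends disk on what changed.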
  • Reply 13 of 19
    noleli2 Posts: 129member
    I currently use CCC with Psync to back up my internal drive to a partition on my external. And I just downloaded PsyncX, and I think I'll try that out.



    But how does Rsync differ from Psync? Do they both do basically the same thing - incremental backups?
  • Reply 14 of 19
    kickaha Posts: 8,760member
    Psync is a Perl-based backup system that I've heard good things about.



    Rsync is a *VERY* feature-rich backup system that can be scaled from one-shot mirroring for personal computers to incremental delta archiving for massive server farms across the internet, with compression and encryption.



    I do backups to the departmental servers from time to time to keep a couple of CVS repositories in sync, and rsync fits my needs well... as I recall, when I looked into psync, it wasn't quite as straightforward to set up as rsync for what I needed.



    BTW, there *is* a default rsync installed with MacOS X, but *do not use it*. It *does not* handle HFS+ metadata or resource forks; it is the base, vanilla rsync distribution one would find on a BSD or Linux system. The RsyncX project I linked to is that same program with HFS+ support added (via the eahfs flag). The flag used to be assumed, but now that it's *required*, a MacOS X backup script run against a non-HFS+-aware copy of rsync will barf, saying it doesn't know anything about the flag, instead of silently just not handling the extra info.
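    A quick sanity check that you're getting the right one (a sketch; exact paths depend on where RsyncX put itself):

    Code:

    # is the rsync on your PATH the HFS+-aware build?
    which rsync

    # the stock build will barf on the unknown --eahfs flag;
    # the RsyncX build should accept it and print its version
    rsync --eahfs --version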



    There's a big push to get HFS+ support into the vanilla rsync codebase, but the maintainers have said that they'd like to put in place a mechanism for metadata in general, not just HFS+, so they're holding off for now.
  • Reply 15 of 19
    torifile Posts: 4,024member
    Thanks for the pointers, kickaha! I got my cron script working and at first it happened so fast (I already had a mostly current copy of my documents folder on my target) that I thought something was wrong. To be sure it was working, I created another directory for my backup and now it's working great.



    Cheers!
  • Reply 16 of 19
    kickaha Posts: 8,760member
    You mean it actually *WORKED*?



    Cool. It was either that, or it was going to wipe your drive...
  • Reply 17 of 19
    mania Posts: 104member
    here is a bash script to back up to your spare hard drive. stick it in /etc/periodic/daily and make it executable. ditto doesn't delete files that you delete, so you gotta clean it up now and then. you could also use xtar instead of ditto if you want some compression.



    #!/bin/bash
    # back up the Users directory with ditto, but only if the
    # backup volume is actually mounted
    if [ -e /Volumes/your_extra_drive_name ]; then
        ditto -rsrcFork /Users /Volumes/your_extra_drive_name/Users
    fi
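    To install it (the filename is just an example; anything executable in /etc/periodic/daily gets run by the daily periodic job):

    Code:

    sudo cp backup-users.sh /etc/periodic/daily/600.backup-users
    sudo chmod +x /etc/periodic/daily/600.backup-users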
  • Reply 18 of 19
    torifile Posts: 4,024member
    Quote:

    Originally posted by Kickaha

    You mean it actually *WORKED*?



    Cool. It was either that, or it was going to wipe your drive...




    Damn it, it worked too well. I had my backup going to my external HD. It was all good if the drive was plugged in when my cron job ran, but tonight I didn't get a chance to plug it in. What happened? Well, it created a /Volumes/Users/torifile for me and proceeded to back up my entire home directory onto my already cramped internal HD. Needless to say, I started to run out of space. :/



    I tried to reboot but my HD was too full to work properly. Luckily I had a working install on my external and was able to navigate to the offending folder and get rid of it.



    I guess when you say "if the folder doesn't exist, it'll get created" you really mean it. Is there any way to keep it from running if my external isn't plugged in?
  • Reply 19 of 19
    staphbaby Posts: 353member
    Quote:

    Originally posted by torifile

    Damn it, it worked too well. I had my backup going to my external HD. It was all good if the drive was plugged in when my cron job ran, but tonight I didn't get a chance to plug it in. What happened? Well, it created a /Volumes/Users/torifile for me and proceeded to back up my entire home directory onto my already cramped internal HD. Needless to say, I started to run out of space. :/



    I tried to reboot but my HD was too full to work properly. Luckily I had a working install on my external and was able to navigate to the offending folder and get rid of it.



    I guess when you say "if the folder doesn't exist, it'll get created" you really mean it. Is there any way to keep it from running if my external isn't plugged in?




    You could script something like the following:



    Code:




    #!/bin/bash

    # run the backup only if the volume is actually mounted
    if test -e /Volumes/foo/; then
        # the rsync invocation from Kickaha's post, pointed at this volume
        rsync --eahfs --archive --relative /Users/torifile /Volumes/foo/
    else
        # log somewhere fixed - cron's working directory isn't predictable
        echo "Volume not mounted! Rsync not done!" >> "$HOME/myrsync.log"
    fi









    I have a bit of a blindspot for shell punctuation, so you may have to add/remove semi-colons here and there



    edit: lookie there, mania beat me to the punch!