I just had to download 16 GB from an unreliable server that would time out, drop connections mid-transfer, and contained invalid files and directories. My experiences were as follows:
Filezilla is my day-to-day FTP program on Windows. It could not even queue all of the files on the server without errors; it was unable to cope with the invalid directories. There was no clear indication of what was queued and what wasn't, and it wasn't possible to pause the queue while I manually selected directories to skip around the broken ones.
Having given up on Filezilla, I decided to try my Mac (prompted by Gruber's high opinion of Interarchy).
I couldn't really figure out Interarchy's interface, but I managed to get it downloading. It regularly ran into timeouts and needed manual intervention every time (not nice). I mistyped my initial password, and even though I checked "add to keychain", it asked for my password every time I retried a failed file. When it retried, I couldn't tell whether it was starting again from the beginning of the queue or not. I eventually gave up.
The next client queued up the files, but its interface doesn't fit on my small monitor, so it was very hard to see what was going on. I think it got stuck on a file while queueing, but the UI for managing the queue was off the screen.
The last one works perfectly. It queues up what you initially drag onto it (it doesn't try to recurse through the whole directory structure up front), then works through each item. Whenever it hit timeouts and disconnects, it marked the file as failed in the queue and moved on; when it hit the invalid directories, it did the same. I ended up with a list of about 40 failed transfers (it retries files that timed out), which I could manually reset and try again. Very nice. It went through the whole site without requiring intervention until the very end, which was good, since the download took 3 days.
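That "mark failed and move on, then let me retry just the failures" behavior is the whole trick. A minimal sketch of it in Python, using the standard-library ftplib client; the host, paths, and the `ftp_fetch` helper are hypothetical, not any of the clients' actual internals:

```python
# Sketch of the queue strategy described above: try each file once,
# record failures instead of aborting, and allow retry passes later.
from ftplib import FTP, error_perm
import os

def drain_queue(paths, fetch):
    """Attempt each queued path once; never abort the whole run.
    Returns (completed, failed) so failures can be reset and retried."""
    completed, failed = [], []
    for path in paths:
        try:
            fetch(path)
            completed.append(path)
        except (OSError, error_perm):  # timeouts, resets, invalid entries
            failed.append(path)
    return completed, failed

def ftp_fetch(ftp, remote_path, local_dir="."):
    """Download one file. A hypothetical helper, not a real client's API."""
    local = os.path.join(local_dir, os.path.basename(remote_path))
    with open(local, "wb") as f:
        ftp.retrbinary("RETR " + remote_path, f.write)

# Usage (hypothetical host): queue everything, then retry only the failures.
# ftp = FTP("ftp.example.com", timeout=30)
# ftp.login()
# todo = ["/pub/a.iso", "/pub/b.iso"]
# done, failed = drain_queue(todo, lambda p: ftp_fetch(ftp, p))
# while failed:  # manual-style retry passes over just the failed files
#     more_done, failed = drain_queue(failed, lambda p: ftp_fetch(ftp, p))
```

The key design point is that the per-file error is swallowed and recorded rather than propagated, which is exactly what separated the winner from the clients that needed hand-holding on every timeout.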