Backup/Restore
Richard Lynch
ceo at l-i-e.com
Thu Sep 30 13:59:02 PDT 2004
Brian McCann wrote:
> Hi all...I'm having a conceptual problem I can't get around and
> was hoping someone could change my focus here. I've been backing up
> roughly 6-8 million small files (roughly 2-4k each) using dump, but
> restores take forever due to the huge number of files and directories.
> Luckily, I haven't had to restore for an emergency yet...but if I
> need to, I'm kinda stuck. I've looked at distributed file systems
> like CODA, but the number of files I have to deal with will make it
> choke. Can anyone offer any suggestions? I've pondered running
> rsync, but am very worried about how long that will take...
Do the files change a lot, or is it more like a few files added/changed
every day, and the bulk don't change?
If it's the latter, you could probably get the best performance from
something like Subversion (a successor to CVS), since it only has to
store and transfer what changed between runs.
Though I suspect rsync would also do well in that case.
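As a rough illustration only (the paths below are made up, and this
assumes rsync is installed and Python is handy as a wrapper), an
incremental pass could be as simple as:

    import subprocess

    # Hypothetical source and destination; adjust for your layout.
    SRC = "/data/"                      # tree holding the millions of small files
    DEST = "backuphost:/backups/data/"  # remote rsync target

    # -a preserves ownership/permissions/times; --delete mirrors removals.
    # After the first full pass, each run only transfers files that changed.
    subprocess.run(["rsync", "-a", "--delete", SRC, DEST], check=True)

The first run is still slow (it has to walk everything), but later runs
should only touch the handful of files that actually changed.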
If a ton of those files are changing all the time, try a test run of
creating a tarball and then backing up the tarball (something like the
sketch below). That may be a simple, manageable solution. There are
probably other, more complex solutions of which I am ignorant :-)
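A minimal sketch of that idea in Python, with hypothetical paths you
would replace with your real data directory and backup spool:

    import tarfile
    import time

    SRC = "/data"  # hypothetical tree of 6-8 million small files
    ARCHIVE = "/backups/data-%s.tar.gz" % time.strftime("%Y%m%d")

    # Roll the whole tree into one compressed archive so dump (or any
    # backup tool) handles a single large file instead of walking
    # millions of inodes.
    with tarfile.open(ARCHIVE, "w:gz") as tar:
        tar.add(SRC, arcname="data")

Restoring then means pulling back and unpacking one big file rather
than recreating millions of tiny ones through the backup tool, though
you still pay the cost of building the archive each time.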
--
Like Music?
http://l-i-e.com/artists.htm