Backup advice

Roland Smith rsmith at xs4all.nl
Thu May 24 16:10:24 UTC 2007


On Wed, May 23, 2007 at 07:27:05PM -0400, Jason Lixfeld wrote:
>  So I feel a need to start backing up my servers.  To that end, I've decided 
>  that it's easier for me to grab an external USB drive instead of a tape.  It 

Buy at least two, and keep one off-site.

>  would seem dump/restore are the tools of choice.  My backup strategy is 
>  pretty much "I don't want to be screwed if my RAID goes away".  That said I 
>  have a few questions along those lines:
> 
>  - Most articles I've read suggest a full backup, followed by incremental 
>  backups.  Is there any real reason to adopt that format for a backup 
>  strategy like mine, or is it reasonable to just do a dump 0 nightly?  I 
>  think the only reason to do just one full backup per 'cycle' would be to 
>  preserve system resources, as I'm sure it's fairly taxing on the system 
>  during dump 0 times.

Depending on the size of your data, a level 0 dump could take a couple
of hours. That is, unless you have a terabyte RAID array, in which case
a single USB disk probably won't cut it anyway. :)

On the other hand, if your dataset changes rapidly you might not save
much with incremental dumps.

You can save time by setting the nodump flag on directories that contain
files that you don't really need or can easily regenerate, such as
/usr/obj, /usr/ports/distfiles, /tmp et cetera.
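For example (the directory list is just the one above; adjust to taste):

```shell
# Mark throwaway directories so dump(8) skips them. By default dump
# honors the nodump flag only at levels >= 1; pass -h 0 to dump if
# you want it honored during level 0 dumps as well.
chflags -R nodump /usr/obj /usr/ports/distfiles /tmp
```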

>  - Can dump incrementally update an existing dump, or is the idea that a dump 
>  is a closed file and nothing except restore should ever touch it?

You cannot update a dump file, AFAIK.

>  - How much does running a backup through gzip actually save?  Is taxing the 
>  system to compress the dump and the extra time it takes actually worth it, 
>  assuming I have enough space on my backup drive to support a dump 0 or two?

It depends. On a normal filesystem you save about 50% with gzip. But if
you have lots of (already compressed) audio and picture data there are
almost no savings. Compressing with gzip shouldn't tax the system too
much, unless it's very old. Using bzip2 usually isn't worth it: it takes
much longer and maxes out the CPU on my 2.4 GHz Athlon64.
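You can see the effect for yourself with two test files (file names here
are only for the demonstration):

```shell
# Highly redundant data compresses to almost nothing; data that is
# already high-entropy (like compressed audio/images) barely shrinks.
head -c 1000000 /dev/zero > zeros.bin
head -c 1000000 /dev/urandom > random.bin
gzip -c zeros.bin > zeros.bin.gz
gzip -c random.bin > random.bin.gz
ls -l zeros.bin.gz random.bin.gz
```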

Do not forget the -L flag if you're dumping a live filesystem!
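Putting it together, a compressed level 0 dump of a live filesystem
could look roughly like this (the /mnt/backup mount point is an
assumption; adjust for your own setup):

```shell
# -0: full dump; -u: record the dump date in /etc/dumpdates;
# -a: write until end-of-media instead of assuming a tape length;
# -L: snapshot the live filesystem first for a consistent dump;
# -f -: write to stdout so the output can be piped through gzip.
dump -0uaL -f - /usr | gzip > /mnt/backup/usr.dump.0.gz
```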

>  - Other folks dumping to a hard drive at night?  Care to share any of your 
>  experiences/rationale?

My desktop machine's file systems are backed up every week to a USB
drive, using gzipped dumps. Every month I start with a new level 0
dump. When I run out of space I delete the oldest set of dumps.
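As an illustration only (the schedule and paths are assumptions, not
what I actually run), such a weekly run could be automated from root's
crontab:

```shell
# crontab(5) fragment: weekly level 1 dump, Sundays at 03:00.
# A monthly level 0 would be scheduled similarly. Note that % must
# be escaped as \% inside a crontab entry.
0 3 * * 0  dump -1uaL -f - /usr | gzip > /mnt/backup/usr.$(date +\%Y\%m\%d).dump.1.gz
```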

When I accidentally nuked my /usr partition I was very happy to be able
to restore things with the tools in /rescue, without first having to
rebuild a lot of ports.
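For reference, restoring a gzipped dump from a rescue environment might
look roughly like this (device names and paths are assumptions):

```shell
# Recreate the filesystem, mount it, then replay the level 0 dump;
# any incrementals would be restored afterwards in ascending order.
newfs /dev/ad0s1f
mount /dev/ad0s1f /mnt
cd /mnt
gzcat /backup/usr.dump.0.gz | restore -rf -
```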

Roland
-- 
R.F.Smith                                   http://www.xs4all.nl/~rsmith/
[plain text _non-HTML_ PGP/GnuPG encrypted/signed email much appreciated]
pgp: 1A2B 477F 9970 BA3C 2914  B7CE 1277 EFB0 C321 A725 (KeyID: C321A725)