Splitting up sets of files for archiving (e.g. to tape, optical media)

Jov amutu at amutu.com
Fri Jan 19 02:41:32 UTC 2018

dar may help: http://dar.linux.free.fr/
from the man page:

*dar* is a full-featured backup tool, aimed at disks (floppy, CD-R(W),
DVD-R(W), Zip, Jaz, hard disks, USB keys, etc.) and, since release 2.4.0,
also adapted to tapes.

*dar* can store a backup in several files (called "slices" in the
following) of a given size, optionally pausing or running a user
command/script before starting the next slice. This allows, for example,
burning the last generated slice to a DVD-R(W) or Blu-ray Disc, or
changing the USB key before continuing with the next one. Like its
grand-brother, the great "tar" command, *dar* may also use compression,
with the difference that compression is applied inside the archive, so
that the compressed slices come out at the defined size.
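For the splitting step described in the quoted message below, one common heuristic is first-fit decreasing: sort the files largest-first, then place each file into the first volume that still has room, opening a new volume when none does. A minimal sketch (the capacity constant, file names, and the `pack_volumes` helper are hypothetical; a real run would walk the input tree with os.walk() and os.path.getsize()):

```python
# First-fit decreasing bin packing: sort files by size, largest first,
# and drop each one into the first volume with enough remaining space.
# This does not guarantee the minimum number of volumes, but it is a
# simple and usually good approximation.

BD_R_CAPACITY = 25_000_000_000  # single-layer BD-R, roughly 25 GB

def pack_volumes(files, capacity=BD_R_CAPACITY):
    """files: iterable of (path, size) pairs.
    Returns a list of volumes, each a list of (path, size)."""
    volumes = []  # each entry: [remaining_bytes, [(path, size), ...]]
    for path, size in sorted(files, key=lambda f: f[1], reverse=True):
        if size > capacity:
            raise ValueError("%s does not fit on a single volume" % path)
        for vol in volumes:
            if size <= vol[0]:          # first volume with room wins
                vol[0] -= size
                vol[1].append((path, size))
                break
        else:                           # no existing volume fits: open one
            volumes.append([capacity - size, [(path, size)]])
    return [contents for _, contents in volumes]

if __name__ == "__main__":
    demo = [("a.iso", 18_000_000_000), ("b.mkv", 12_000_000_000),
            ("c.tar", 9_000_000_000), ("d.zip", 6_000_000_000)]
    for i, vol in enumerate(pack_volumes(demo), start=1):
        print("#%02d: %s" % (i, ", ".join(p for p, _ in vol)))
    # prints:
    # #01: a.iso, d.zip
    # #02: b.mkv, c.tar
```

Each resulting per-volume list can then be hard-linked into its own temporary directory and handed to the burning program one disc at a time, as described below.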

2018-01-19 10:06 GMT+08:00 Ronald F. Guilmette <rfg at tristatelogic.com>:

> This isn't really FreeBSD specific, but in my experience the folks on
> this list have a lot of knowledge about a lot of nice, useful free software
> tools, so I hope nobody will begrudge me for asking this question here.
> I'm looking for a pre-existing software tool, which may or may not already
> exist, and which will do the following job...
> Problem statement:
> Imagine that you have a big set of files that you would like to archive
> to some sort of archiving media, such as tapes, or optical media, where
> each unit of said archiving media has a capacity considerably less than
> the total aggregate size of all of the files you want to archive.
> Imagine further that you would like your set of input files to be spread
> across the units of the output (archive) media such that no single input
> file is ever split across more than one unit of the output media, in order
> to simplify recovery/restore of individual files.
> Lastly, assume that it is desired to minimize, as much as reasonably
> possible, the total number of output (archive) media units used to
> archive the entire set of input files.  (And to further this goal,
> it is acceptable for files from any single input subdirectory to be
> scattered among the various output media units.)
> +_+_+_+_+_+_+_+_+_+_
> In my case, I want to archive several hundred gigabytes onto a set of
> blank BD-R disks.
> I plan to use ImgBurn to actually write the BD-R disks.
> So basically, I just need a tool to analyze the input file set, applying
> some sort of bin packing algorithm, and then spit out a list of which
> specific files should go into each specific archive volume, e.g. #01, #02,
> #03... etc.  Each such set of files will then, in turn, be hard-linked
> into a temporary directory, and then, one by one, ImgBurn will be told
> to write each of these temp directories to a single output BD-R disk.
> I have written a small software tool to do the above "splitting" job,
> and I am currently improving upon it, but it occurred to me that I
> should at least ask if someone else has perhaps already perfected this
> exact wheel that I am busy re-inventing.
> Regards,
> rfg
> P.S.  It seems unlikely that I'm the first and only person to have ever
> written a tool to do this specific job, but on the off chance that I am,
> I am more than willing to contribute my little tool to the ever-expanding
> ports tree.
> _______________________________________________
> freebsd-questions at freebsd.org mailing list
> https://lists.freebsd.org/mailman/listinfo/freebsd-questions
> To unsubscribe, send any mail to "freebsd-questions-
> unsubscribe at freebsd.org"