Managing very large files

Chad Perrin perrin at apotheon.com
Thu Oct 4 13:30:20 PDT 2007


On Thu, Oct 04, 2007 at 04:16:29PM -0400, Steve Bertrand wrote:
> >> man 1 split
> >>
> >> (esp. -l)
> > 
> > That's probably the best option for a one-shot deal like this.  On the
> > other hand, Perl itself provides the ability to go through a file one
> > line at a time, so you could just read a line, operate, write a line (to
> > a new file) as needed, over and over, until you get through the whole
> > file.
> > 
> > The real problem would be reading the whole file into a variable (or even
> > multiple variables) at once.
> 
> This is what I am afraid of. Just out of curiosity, if I did try to read
> the entire file into a Perl variable all at once, would the box panic,
> or as the saying goes 'what could possibly go wrong'?

Perl will happily load stuff into RAM until you run out of RAM.  At that
point it would just keep allocating, the box would start swapping, and
everything would slow to a crawl.  Eventually you'd exhaust swap too --
and rather than the box panicking, the kernel would most likely start
killing processes to free memory, with your bloated Perl process being
the obvious first candidate.
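
For what it's worth, the line-by-line approach mentioned above takes
only a handful of lines of Perl.  Here's a minimal sketch -- the file
names and the per-line substitution are made up for illustration:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical file names -- substitute your own.
    open my $in,  '<', 'huge_input.txt' or die "can't open input: $!";
    open my $out, '>', 'processed.txt'  or die "can't open output: $!";

    # Only the current line is ever held in memory.
    while (my $line = <$in>) {
        $line =~ s/foo/bar/g;   # stand-in for whatever you need to do
        print $out $line;
    }

    close $out or die "close failed: $!";
    close $in;

Memory use stays flat no matter how big the file gets, because $/ (the
input record separator) is left at its newline default, so the while
loop reads one record at a time instead of slurping the whole file.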

Perl is known to some as the "Swiss Army chainsaw" for a reason: it'll
cut limbs off trees about as quickly as you can put limbs in front of it.
If you put one of your own limbs in front of it (say, a leg), it'll do
exactly the same thing -- but with more bleeding and screaming.

It's kinda like Unix, that way.

-- 
CCD CopyWrite Chad Perrin [ http://ccd.apotheon.org ]
Brian K. Reid: "In computer science, we stand on each other's feet."

