Managing very large files
fbsd.questions at rachie.is-a-geek.net
Thu Oct 4 13:38:01 PDT 2007
On Thursday 04 October 2007 22:16:29 Steve Bertrand wrote:
> >> man 1 split
> >> (esp. -l)
> > That's probably the best option for a one-shot deal like this. On the
> > other hand, Perl itself provides the ability to go through a file one
> > line at a time, so you could just read a line, operate, write a line (to
> > a new file) as needed, over and over, until you get through the whole
> > file.
> > The real problem would be reading the whole file into a variable (or even
> > multiple variables) at once.
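For the split(1) route mentioned above, a quick sketch (the sample file and
line counts here are made up for illustration; for a real multi-GB file you'd
use something like -l 1000000):

```shell
# Create a small sample file (a stand-in for the real huge file)
printf 'line %d\n' 1 2 3 4 5 > sample.txt
# Split into pieces of at most 2 lines each; with no prefix argument,
# split names the output files xaa, xab, xac, ...
split -l 2 sample.txt
wc -l xaa xab xac
```

Note that -l splits on line boundaries only; it knows nothing about
multi-line records, which is exactly the caveat raised below.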
> This is what I am afraid of. Just out of curiosity, if I did try to read
> the entire file into a Perl variable all at once, would the box panic,
> or as the saying goes 'what could possibly go wrong'?
There's probably a reason why you want to process that file - splitting it can
be a problem if you need to keep track of state and the split lands on the
wrong line. So I'd probably open it in perl (or whatever processor) directly,
and use a database for storage if I really needed to keep string context, so
that on each line iteration my perl process's memory stays clean.