Managing very large files
jorn at wcborstel.com
Fri Oct 5 01:26:57 PDT 2007
Steve Bertrand wrote:
>>> man 1 split
>>> (esp. -l)
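For the split(1) suggestion above, a minimal sketch (the file name bigfile.log and the chunk size are made up for illustration):

```shell
# Create a small 10-line sample file so the sketch runs end-to-end
# (in practice bigfile.log would be the huge file).
seq 1 10 > bigfile.log

# -l 4: split into pieces of at most 4 lines each.
# Default output names are xaa, xab, xac, ...
split -l 4 bigfile.log

wc -l xaa xab xac
```

Here xaa and xab get 4 lines each and xac the remaining 2; the pieces concatenate back to the original with cat.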
>> That's probably the best option for a one-shot deal like this. On the
>> other hand, Perl itself provides the ability to go through a file one
>> line at a time, so you could just read a line, operate, write a line (to
>> a new file) as needed, over and over, until you get through the whole file.
>> The real problem would be reading the whole file into a variable (or even
>> multiple variables) at once.
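The read-a-line, operate, write-a-line approach described above might look like this (file names and the ERROR filter are hypothetical; the sample-file setup is only there so the sketch runs as-is):

```perl
use strict;
use warnings;

# Create a small sample input so the sketch is runnable end-to-end
# (in practice huge.log would be the multi-gigabyte file).
open my $tmp, '>', 'huge.log' or die "create huge.log: $!";
print {$tmp} "ok 1\nERROR disk full\nok 2\nERROR timeout\n";
close $tmp;

open my $in,  '<', 'huge.log'     or die "open huge.log: $!";
open my $out, '>', 'filtered.log' or die "open filtered.log: $!";

# Read one line at a time; only the current line is ever in memory,
# no matter how large the input file is.
while (my $line = <$in>) {
    # "operate" on the line here, e.g. keep only lines containing ERROR
    print {$out} $line if $line =~ /ERROR/;
}

close $in;
close $out or die "close filtered.log: $!";
```

The while (<$in>) loop reads in fixed-size buffers under the hood, so memory use stays flat regardless of file size.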
> This is what I am afraid of. Just out of curiosity, if I did try to read
> the entire file into a Perl variable all at once, would the box panic,
> or as the saying goes 'what could possibly go wrong'?
Check out Tie::File on CPAN. This Perl module treats every line in a
file as an array element, and each element is loaded into memory only
when it is requested. In other words: this will work great with huge
files such as these, because the entire file is never loaded into
memory at once.
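A minimal Tie::File sketch (the file name huge.log is hypothetical, and the three-line sample file exists only so the snippet runs as-is):

```perl
use strict;
use warnings;
use Tie::File;

# Small sample file standing in for the huge one.
open my $tmp, '>', 'huge.log' or die "create huge.log: $!";
print {$tmp} "line one\nline two\nline three\n";
close $tmp;

# Tie the file to an array: elements are fetched from disk lazily,
# so even a multi-gigabyte file is never slurped into memory at once.
tie my @lines, 'Tie::File', 'huge.log' or die "tie huge.log: $!";

printf "file has %d lines; first is: %s\n", scalar @lines, $lines[0];

# Assigning to an element rewrites that line in the file on disk.
$lines[1] = 'line two, edited';

untie @lines;
```

Note that writes through the tied array go straight back to the file, which is convenient for in-place edits but worth knowing before you modify the array casually.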
> freebsd-questions at freebsd.org mailing list
> To unsubscribe, send any mail to "freebsd-questions-unsubscribe at freebsd.org"
More information about the freebsd-questions mailing list