Managing very large files
Steve Bertrand
iaccounts at ibctech.ca
Thu Oct 4 05:43:33 PDT 2007
Hi all,
I've got a 28GB tcpdump capture file that I need to (hopefully) break
down into a series of files of 100,000k lines or so, ideally without
having to read the entire file into memory at once.
I need to run a few Perl processes on the data in the file, but AFAICT,
doing so on the entire original file is asking for trouble.
Is there any way to accomplish this, preferably with the ability to
incrementally name each newly created file?
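One way to do this without reading the whole file into memory is the standard split(1) utility, which streams through its input and writes fixed-size chunks with incrementally named suffixes (aa, ab, ac, ...). A minimal sketch, assuming the capture has been dumped to a text file called capture.txt (the filenames and chunk prefix here are placeholders, not from the original post):

```shell
# Split capture.txt into chunks of 100000 lines each.
# split reads line by line, so memory use stays constant
# regardless of input size. Output files are named
# chunk_aa, chunk_ab, chunk_ac, ... in order.
split -l 100000 capture.txt chunk_
```

Each resulting chunk_* file can then be fed to the Perl processes individually, e.g. `for f in chunk_*; do perl process.pl "$f"; done`. If more than 676 chunks are expected, the suffix length can be widened with `-a`.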
TIA,
Steve
More information about the freebsd-questions mailing list