Managing very large files

Chad Perrin perrin at apotheon.com
Thu Oct 4 13:31:33 PDT 2007


On Thu, Oct 04, 2007 at 04:25:18PM -0400, Steve Bertrand wrote:
> Heiko Wundram (Beenic) wrote:
> > On Thursday, 04 October 2007 22:16:29, Steve Bertrand wrote:
> >> This is what I am afraid of. Just out of curiosity, if I did try to read
> >> the entire file into a Perl variable all at once, would the box panic,
> >> or as the saying goes 'what could possibly go wrong'?
> > 
> > Perl most certainly wouldn't make the box panic (at least I hope so :-)), but 
> > it would barf and quit at the point where it can no longer allocate memory 
> > (because all of it is in use). Meanwhile, your swap would have filled up 
> > completely and the box would have become totally unresponsive, which clears 
> > up the instant Perl dies or quits.
> > 
> > Try it. ;-) (at your own risk)
> 
> LOL, on a production box?...nope.
> 
> Hence my asking here, probing whether someone has made this mistake
> before I do ;)
> 
> The reason for the massive file size was my haste in running out of the
> office on Friday and forgetting to kill the tcpdump process before the
> weekend began.
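
For what it's worth, there's no need to slurp the whole thing into one
variable to work with it.  Reading it a line at a time keeps memory use
flat no matter how big the file is.  A rough, untested sketch, assuming
the dump was written as tcpdump's text output (not a binary pcap file),
with a placeholder path and a placeholder filter:

  #!/usr/bin/perl
  use strict;
  use warnings;

  my $file = '/var/log/huge-capture.txt';   # placeholder path
  open my $fh, '<', $file or die "Cannot open $file: $!";

  my $count = 0;
  while (my $line = <$fh>) {            # only one line in memory at a time
      $count++ if $line =~ /\bUDP\b/;   # example filter; adjust to taste
  }
  close $fh;

  print "Matched $count lines\n";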

Sounds like you may want a Perl script to automate managing your
tcpdumps.
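
Something along these lines, say.  It's only a sketch with guessed
details: it assumes the captures live in one directory with a .pcap
suffix and that a week of history is enough, so adjust both to taste,
and hang it off cron so a forgotten tcpdump can't fill the disk again:

  #!/usr/bin/perl
  use strict;
  use warnings;

  my $dir      = '/var/captures';   # hypothetical capture directory
  my $max_days = 7;                 # keep a week of history

  opendir my $dh, $dir or die "Cannot open $dir: $!";
  for my $file (readdir $dh) {
      next unless $file =~ /\.pcap$/;   # assumed naming convention
      my $path = "$dir/$file";
      # -M gives the file's age in days; prune anything older than the cap
      if (-M $path > $max_days) {
          unlink $path or warn "Could not remove $path: $!";
      }
  }
  closedir $dh;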

Just a thought.

-- 
CCD CopyWrite Chad Perrin [ http://ccd.apotheon.org ]
Kent Beck: "I always knew that one day Smalltalk would replace Java.  I
just didn't know it would be called Ruby."

