Very large directory

David Landgren david at landgren.net
Thu Jan 20 06:47:09 PST 2005


Oliver Fromme wrote:
> Peter Jeremy <PeterJeremy at optushome.com.au> wrote:
>  > On Wed, 2005-Jan-19 21:30:53 -0600, Phillip Salzman wrote:
>  > > They've been running for a little while now - and recently we've noticed a
>  > > lot of disk space disappearing.  Shortly after that, a simple du into our
>  > > /var/spool returned a not so nice error:
>  > > 
>  > >       du: fts_read: Cannot allocate memory
>  > > 
>  > > No matter what command I run on that directory, I just don't seem to have
>  > > enough available resources  to show the files let alone delete them (echo *,
>  > > ls, find, rm -rf, etc.)
>  > 
>  > I suspect you will need to write something that uses dirent(3) to scan
>  > the offending directory and delete (or whatever) the files one by one.
>  > 
>  > Skeleton code (in perl) would look like:
>  > [...]
> 
> I would suggest trying this simple hack:
> 
> cd /var/spool/directory ; cat . | strings | xargs rm -f
> 
> It's a dirty hack, but might work, if the file names in
> that directory aren't too strange (no spaces etc.).

Why suggest a dirty hack that might not work, when the proposed Perl script
would have worked perfectly?

David


