Very large directory
David Landgren
david at landgren.net
Thu Jan 20 06:46:13 PST 2005
Peter Jeremy wrote:
> On Wed, 2005-Jan-19 21:30:53 -0600, Phillip Salzman wrote:
>
>>They've been running for a little while now - and recently we've noticed a
>>lot of disk space disappearing. Shortly after that, a simple du into our
>>/var/spool returned a not so nice error:
>>
>> du: fts_read: Cannot allocate memory
>>
>>No matter what command I run on that directory, I just don't seem to have
>>enough available resources to show the files let alone delete them (echo *,
>>ls, find, rm -rf, etc.)
>
>
> I suspect you will need to write something that uses dirent(3) to scan
> the offending directory and delete (or whatever) the files one by one.
>
> Skeleton code (in perl) would look like:
>
> chdir $some_dir or die "Can't cd $some_dir: $!";
> opendir(DIR, ".") or die "Can't opendir: $!";
> while (my $file = readdir(DIR)) {
>     next if ($file eq '.' || $file eq '..');
>     next if (&this_file_is_still_needed($file));
>     unlink $file or warn "Unable to delete $file: $!";
> }
> closedir DIR;
Similarly,

    opendir(DIR, $some_dir) or die "Can't open dir $some_dir: $!";
    while (defined(my $file = readdir(DIR))) {
        print "$file\n" unless $file eq '.' || $file eq '..';
    }
    closedir(DIR);
...will print the files one per line, which can be piped to more or
redirected to another file. This will let you get a feel for the names
of the files in the directory. The directory could then be cleaned up
with various pipelines, such as egrep 'foo|bar|rat' | xargs rm
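If shelling out through a pipeline is awkward (filenames with embedded
whitespace, for instance), the same cleanup can be done entirely in
Perl. A minimal sketch, assuming the junk files match some known
pattern; the directory path and the qr// pattern below are placeholders
you would substitute:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Delete entries of $dir matching $pattern, one at a time, so no
    # huge argument list is ever built. Returns the number deleted.
    sub purge_matching {
        my ($dir, $pattern) = @_;
        opendir(my $dh, $dir) or die "Can't opendir $dir: $!";
        my $count = 0;
        while (defined(my $file = readdir($dh))) {
            next if $file eq '.' || $file eq '..';
            next unless $file =~ $pattern;
            if (unlink "$dir/$file") {
                $count++;
            } else {
                warn "Unable to delete $dir/$file: $!";
            }
        }
        closedir($dh);
        return $count;
    }

    # Example (hypothetical path and pattern):
    # purge_matching('/var/spool/somequeue', qr/^(foo|bar|rat)/);

Because readdir() hands back one name at a time, this stays in constant
memory no matter how many entries the directory holds, which is exactly
what du/ls/rm could not manage here.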
Note: you want to wrap the my $file = readdir(DIR) in a defined(),
because the string "0" is false in boolean context; otherwise your loop
will exit early if you come across a file named 0 (zero).
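The pitfall can be seen side by side. A small sketch contrasting the
naive loop with the defined()-guarded one (both subs are illustrative,
not from the original post):

    use strict;
    use warnings;

    # Naive loop: the body never runs for a file named "0", because
    # "0" is false and the while condition terminates the loop.
    sub scan_naive {
        my ($dir) = @_;
        opendir(my $dh, $dir) or die "Can't opendir $dir: $!";
        my @seen;
        while (my $file = readdir($dh)) {   # BUG: stops on "0"
            push @seen, $file;
        }
        closedir($dh);
        return @seen;
    }

    # Guarded loop: only an undef (end of directory) ends the scan,
    # so a file named "0" is seen like any other entry.
    sub scan_defined {
        my ($dir) = @_;
        opendir(my $dh, $dir) or die "Can't opendir $dir: $!";
        my @seen;
        while (defined(my $file = readdir($dh))) {
            push @seen, $file;
        }
        closedir($dh);
        return @seen;
    }

In a directory containing a file named 0, scan_naive() will never
report it, while scan_defined() will.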
David
More information about the freebsd-stable mailing list