Directories with 2 million files
Eric Anderson
anderson at centtech.com
Wed Apr 21 08:11:17 PDT 2004
Tim Robbins wrote:
>On Wed, Apr 21, 2004 at 08:42:53AM -0500, Eric Anderson wrote:
>
>>First, let me say that I am impressed (but not shocked) - FreeBSD
>>quietly handled my building of a directory with 2055476 files in it.
>>I'm not sure if there is a limit to this number, but at least we know it
>>works to 2 million. I'm running 5.2.1-RELEASE.
>>
>>However, several tools seem to choke on that many files - mainly ls and
>>du. Find works just fine. Here's what my directory looks like (from
>>the parent):
>>
>>drwxr-xr-x 2 anderson anderson 50919936 Apr 21 08:25 data
>>
>>and when I cd into that directory, and do an ls:
>>
>>$ ls -al | wc -l
>>ls: fts_read: Cannot allocate memory
>> 0
>
>The problem here is likely to be that ls is trying to store all the
>filenames in memory in order to sort them. Try using the -f option
>to disable sorting. If you really do need a sorted list of filenames,
>pipe the output through 'sort'.
>
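For reference, that suggestion looks roughly like this (assuming FreeBSD's
ls, where -f disables sorting and also turns on -a; the output file name is
just an example):

$ ls -f | wc -l
$ ls -f | sort > sorted-names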
Doing 'ls -f' works, but still manages to munch up about 260MB of RAM,
which succeeds since I have enough memory, but otherwise would not. An
'ls -alf' does not work (I assume because it is trying to sum the total
bytes of all files before printing the data). I just noticed that find
also eats up about the same amount of memory before it prints the list.
This perl script does it in about 2.5 seconds, with minimal memory:
my $count = 0;
opendir(INDEX_PATH, ".") or die "opendir: $!\n";
# defined() so an entry literally named "0" doesn't end the loop early
while (defined(my $file = readdir(INDEX_PATH))) {
        $count++;
}
closedir(INDEX_PATH);
print "$count\n";
Eric
--
------------------------------------------------------------------
Eric Anderson        Sr. Systems Administrator        Centaur Technology
Today is the tomorrow you worried about yesterday.
------------------------------------------------------------------