Directories with 2 million files

Eric Anderson anderson at centtech.com
Wed Apr 21 13:51:52 PDT 2004


masta wrote:

> Garance A Drosihn wrote:
>
>> At 3:09 PM -0500 4/21/04, Eric Anderson wrote:
>>
>>> Garance A Drosihn wrote:
>>>
>>> I suppose this is one of those "who needs files bigger than 2GB?"
>>> things...
>>
>> Perhaps, but as a general rule we'd like our system utilities to
>> at least *work* in extreme situations.  This is something I'd
>> love to dig into, but I'm not sure I have the time right now.
>>
> I'm not sure how we can improve this situation, considering that an
> `ls -l` is forced to stat every file and store that info until the
> time comes to dump it to the tty for the human operator. The memory
> cost grows with the file count, and it seems unfixable unless you
> want to find a way to page out the stat information for each file to
> a scratch file of some sort, then cat that info back to the operator
> at the conclusion of the main loop. Even then, listing 2 million
> files will be excessive just for storing the file names for display.
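
For illustration, the difference masta describes comes down to
whether the stat results are buffered or streamed.  Here is a minimal
sketch (plain POSIX opendir/readdir/lstat, not the actual ls(1)
source) that stats each entry and prints it immediately, so memory
use stays constant no matter how many entries the directory holds -
at the cost of the sorting and column alignment that force a real
`ls -l` to buffer everything:

  #include <sys/stat.h>

  #include <dirent.h>
  #include <limits.h>
  #include <stdint.h>
  #include <stdio.h>
  #include <string.h>

  int
  main(int argc, char *argv[])
  {
          const char *dir = (argc > 1) ? argv[1] : ".";
          DIR *dp = opendir(dir);
          if (dp == NULL) {
                  perror(dir);
                  return (1);
          }

          struct dirent *de;
          while ((de = readdir(dp)) != NULL) {
                  if (strcmp(de->d_name, ".") == 0 ||
                      strcmp(de->d_name, "..") == 0)
                          continue;

                  char path[PATH_MAX];
                  snprintf(path, sizeof(path), "%s/%s",
                      dir, de->d_name);

                  struct stat sb;
                  if (lstat(path, &sb) == -1) {
                          perror(path);
                          continue;
                  }

                  /* Emit immediately; nothing accumulates. */
                  printf("%10ju %s\n",
                      (uintmax_t)sb.st_size, de->d_name);
          }
          closedir(dp);
          return (0);
  }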


At a bare minimum, du should work, if you ask me.  ls is almost a
separate issue - about the only time you need to 'ls' a directory
with that many files is if you need to feed them to a script.  I did
it mostly out of curiosity, but du is an essential tool in this case.
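
For the curious, a du-style walk only needs a running total, so in
principle its memory use is bounded by the depth of the tree rather
than the number of files.  A minimal sketch using fts(3) - which, as
far as I know, is what the BSD du(1) is built on - summing st_blocks
as it goes (not the actual du source; error handling trimmed):

  #include <sys/types.h>
  #include <sys/stat.h>

  #include <fts.h>
  #include <stdint.h>
  #include <stdio.h>

  int
  main(int argc, char *argv[])
  {
          char *dflt[] = { ".", NULL };
          char **paths = (argc > 1) ? argv + 1 : dflt;

          FTS *f = fts_open(paths, FTS_PHYSICAL, NULL);
          if (f == NULL) {
                  perror("fts_open");
                  return (1);
          }

          uintmax_t blocks = 0;
          FTSENT *e;
          while ((e = fts_read(f)) != NULL) {
                  /* Directories show up twice; count them only on
                   * the post-order (FTS_DP) visit. */
                  if (e->fts_info == FTS_F || e->fts_info == FTS_DP)
                          blocks += (uintmax_t)e->fts_statp->st_blocks;
          }
          fts_close(f);

          /* st_blocks counts 512-byte blocks. */
          printf("%ju\t(512-byte blocks)\n", blocks);
          return (0);
  }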

Eric

-- 
------------------------------------------------------------------
Eric Anderson     Sr. Systems Administrator    Centaur Technology
Today is the tomorrow you worried about yesterday.
------------------------------------------------------------------


