Directories with 2 million files

masta diz at linuxpowered.com
Wed Apr 21 13:46:58 PDT 2004


Garance A Drosihn wrote:

> At 3:09 PM -0500 4/21/04, Eric Anderson wrote:
>
>> Garance A Drosihn wrote:
>>
>> I suppose this is one of those "who needs files bigger than 2GB?"
>> things...
>
>
> Perhaps, but as a general rule we'd like our system utilities to
> at least *work* in extreme situations.  This is something I'd
> love to dig into if I had the time, but I'm not sure I have the
> time right now.
>
I'm not sure how we can improve this situation. An `ls -l` is forced to
stat every file and hold that information in memory until it is time to
dump it to the tty for the human operator, since it has to see every
entry before it can sort them and compute the column widths. So memory
use grows with the number of entries, and that seems unfixable unless
you find a way to page the stat information for each file out to a
scratch file of some sort, then cat that info back to the operator when
the main loop finishes. Even then, listing 2 million files will be
excessive just in storing the file names for display.
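
For what it's worth, if you give up sorting and column alignment (the
two things that force ls to buffer everything), the listing can be
streamed in constant memory. A rough sketch of the idea, purely
hypothetical and not taken from the real ls(1) source:

#include <sys/stat.h>
#include <dirent.h>
#include <limits.h>
#include <stdint.h>
#include <stdio.h>

int
main(int argc, char *argv[])
{
	const char *path = (argc > 1) ? argv[1] : ".";
	struct dirent *de;
	struct stat sb;
	char full[PATH_MAX];
	DIR *dp = opendir(path);

	if (dp == NULL) {
		perror(path);
		return (1);
	}
	while ((de = readdir(dp)) != NULL) {
		snprintf(full, sizeof(full), "%s/%s", path, de->d_name);
		if (lstat(full, &sb) == -1) {
			perror(full);
			continue;
		}
		/*
		 * Print the entry immediately instead of buffering it,
		 * so memory use stays constant no matter how many
		 * files the directory holds.
		 */
		printf("%10jd  %s\n", (intmax_t)sb.st_size, de->d_name);
	}
	closedir(dp);
	return (0);
}

Of course this produces unsorted, unaligned output, which is exactly
the trade-off: the buffering people complain about is what buys ls its
sorted, lined-up columns.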

-Jon

