Max. number of opened files, efficiency

Laszlo Nagy gandalf at shopzeus.com
Wed Aug 13 14:27:47 UTC 2008


How many files can I open at the same time under FreeBSD?
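(For reference, this is how I looked at the per-process descriptor limit from Python; if I understand correctly, the system-wide ceiling on FreeBSD is the kern.maxfiles sysctl. Just a quick check, not part of the real program:)

import resource

# per-process open file descriptor limit: (soft limit, hard limit)
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print "descriptor limit: soft=%d hard=%d" % (soft, hard)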

Problem: I'm building a pivot table, and when I drill down into the facts, I 
would like to create a new temporary file for each possible dimension 
value. In most cases there will be fewer than 1000 dimension values. I 
tried opening 1000 temporary files and was able to do so within one second.

But how efficient is that? What happens when I open 1000 temporary 
files and write data into them in random order, 10 million times in total 
(about 10,000 write operations per file)? Will the OS handle this 
efficiently? Is efficiency affected by the underlying filesystem?
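Roughly this is the workload I have in mind (just a sketch; the record 
contents and sizes are placeholders, and in the real program the writes 
would be driven by the fact data rather than random.choice):

import tempfile
import random
import time

N = 1000                 # number of temporary files (dimension values)
TOTAL_WRITES = 10000000  # 10 million writes, ~10,000 per file on average

files = [tempfile.TemporaryFile() for i in range(N)]
start = time.time()
for i in xrange(TOTAL_WRITES):
    # pick a file at random and append a small record to it
    random.choice(files).write("fact record placeholder\n")
stop = time.time()
print "%d writes/second" % int(TOTAL_WRITES / (stop - start))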

I also tried creating 10,000 temporary files, but performance dropped noticeably.

Example in Python:

import tempfile
import time

N = 10000
start = time.time()
# create N anonymous temporary files and keep them all open
files = [tempfile.TemporaryFile() for i in range(N)]
stop = time.time()
print "created %s files/second" % int(N / (stop - start))

On my computer this program prints "3814 files/second" for N=1000, and  
"1561 files/second" for N=10000.

Thanks,

   Laszlo


